Comment on AI Launches Nukes In ‘Worrying’ War Simulation: ‘I Just Want to Have Peace in the World’
sentient_loom@sh.itjust.works 1 year ago
Why would you use a chat-bot for decision-making? Fucking morons.
CeeBee@lemmy.world 1 year ago
They didn’t. They used LLMs.
sentient_loom@sh.itjust.works 1 year ago
Which are chat bots.
CeeBee@lemmy.world 1 year ago
A chat bot can be an LLM, but an LLM is not inherently a chat bot.
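For instance, here's a minimal sketch (assuming the Hugging Face transformers library; the model and prompt are purely illustrative) of an LLM doing plain one-shot completion, with no chat loop anywhere:

```python
# One prompt in, one continuation out. Nothing here "chats" or keeps state;
# the model is just a next-token predictor wrapped in a generation call.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # any causal LM works

out = generator("The capital of France is", max_new_tokens=5)
print(out[0]["generated_text"])
```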
forrgott@lemm.ee 1 year ago
A glorified chatbot, in other words.
tabular@lemmy.world 1 year ago
If one is feeling cynical, humans are chatbots in shoes.
forrgott@lemm.ee 1 year ago
I don’t know if I love or hate your comment. (Yes, you’re right, shut up.) Well played, Internet stranger.
CeeBee@lemmy.world 1 year ago
In other words, you don’t really know what LLMs are.
FiskFisk33@startrek.website 1 year ago
What do you think large language model means? If you want decision making, you should train a model on data relevant to said decision making.
This is like being confused as to why a hammer does a shit job of driving screws.
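A hedged sketch of that point: if you want decision making, train something on decision data. The features, labels, and numbers below are entirely made up for illustration (scikit-learn assumed):

```python
# Toy classifier trained directly on (made-up) decision outcomes, instead of
# asking a general-purpose text generator to improvise a decision.
from sklearn.linear_model import LogisticRegression

# Each row: [threat_level, diplomacy_score]; label: 1 = escalate, 0 = hold.
X = [[0.9, 0.1], [0.2, 0.8], [0.7, 0.3], [0.1, 0.9]]
y = [1, 0, 1, 0]

model = LogisticRegression().fit(X, y)
print(model.predict([[0.3, 0.7]]))  # decision grounded in the training data
```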
CeeBee@lemmy.world 1 year ago
Not a chat bot, because that’s not what they are. And saying so is both reductive and wholly incorrect.
Partly true. There’s more to it than throwing domain-specific data at the training set.
Max_P@lemmy.max-p.me 1 year ago
That’s what the “language” part of “Large Language Model” means. It processes, predicts and generates language. You can omit the chat part if you want, but it’s still a text prompt to text response generator. The chat part just feeds it back the last couple messages for context. It doesn’t understand anything.
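A minimal sketch of that “chat part”: complete() is a hypothetical stand-in for any text-in/text-out model call, and the “memory” is nothing but recent turns pasted back into the next prompt:

```python
# Chat as a thin loop around a stateless prompt-to-text generator.
def complete(prompt: str) -> str:
    # Stand-in for a real LLM call; returns a canned reply so the sketch runs.
    return "(model reply)"

history: list[str] = []

def chat(user_msg: str) -> str:
    history.append(f"User: {user_msg}")
    # The model itself remembers nothing; we re-send recent turns each time.
    prompt = "\n".join(history[-6:]) + "\nAssistant:"
    reply = complete(prompt)
    history.append(f"Assistant: {reply}")
    return reply

print(chat("Hello"))
```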
CeeBee@lemmy.world 1 year ago
Language does not mean “text”. It’s not “Large Text Generator”. The core definition of the word language is “communication”.
An LLM isn’t (always) trained exclusively on text. And even those that are become greater than the raw sum of their parts. What that means is that the model can learn context that isn’t present in the raw text itself.
Partially true. There’s more to it though.
And neither does antivirus, but it still does its job.