Electricd@lemmybefree.net 1 day ago
Are you trying to make us believe that AI doesn’t hallucinate?
No, but I was specifically talking about the glue and pizza example
oascany@lemmy.world 1 day ago
It doesn’t; it generates incorrect information. This is because AI doesn’t think or dream: it’s a generative technology that outputs information based on whatever went in. It can’t hallucinate because it can’t think or feel.
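A minimal sketch of what “outputs information based on whatever went in” amounts to (all tokens and scores invented for illustration): the model turns an input into a probability distribution over candidate next tokens and emits the likeliest continuation; nothing in the computation checks facts.

```python
import math

def softmax(scores):
    """Turn raw scores into a probability distribution."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores a model might assign to continuations of
# "to keep cheese on pizza, add ..." -- numbers made up for illustration.
candidates = ["more cheese", "tomato sauce", "glue"]
scores = [2.1, 1.9, 1.4]

for token, p in zip(candidates, softmax(scores)):
    print(f"{token}: {p:.2f}")
# The output is just the highest-probability continuation of the input;
# at no point is anything checked against the real world.
```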
Blue_Morpho@lemmy.world 1 day ago
Hallucinate is the word that has been assigned to what you described. When you don’t load the word with additional emotional baggage, hallucinate is a reasonable word to pick to describe the case where an LLM follows a chain of words that have internal correlation but no basis in external reality.
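A toy version of that chain-following, assuming a made-up next-word table (a real LLM uses a learned neural model, not a lookup table): each step picks a word that correlates strongly with the previous one, and no step ever consults external reality.

```python
import random

# Hypothetical next-word table with invented weights; stands in for the
# correlations an LLM learns from text.
chain = {
    "the":    {"sauce": 0.5, "cheese": 0.5},
    "cheese": {"needs": 0.7, "melts": 0.3},
    "sauce":  {"needs": 0.6, "drips": 0.4},
    "needs":  {"glue": 0.2, "heat": 0.8},
}

def follow(word, limit=6):
    """Walk the chain by internal correlation alone until it dead-ends."""
    words = [word]
    while words[-1] in chain and len(words) < limit:
        options, weights = zip(*chain[words[-1]].items())
        words.append(random.choices(options, weights=weights)[0])
    return " ".join(words)

print(follow("the"))  # e.g. "the cheese needs glue" -- fluent, internally coherent, false
```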
oascany@lemmy.world 1 day ago
Trying to strip away the “emotional baggage” is not how language works. A term means something and applies somewhere. Generative models do not have the capacity to hallucinate. If you need to apply a human term to a non-human technology that pretends to be human, you might want to use the term “confabulate”, because hallucination is a response to a stimulus while confabulation is, in simple terms, bullshitting.
Blue_Morpho@lemmy.world 1 day ago
Words are redefined all the time. Kilo should mean 1000; that was the international standard definition for 150 years. But now, with computers, it means 1024 (see the quick arithmetic below).
Confabulation would have been a better choice. But people have chosen hallucinate.
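For the numeric side of that redefinition, a quick check (nothing assumed beyond the two prefixes; “kibi” is the IEC name later introduced for the binary version):

```python
KILO = 10 ** 3   # SI kilo-: 1000, the long-standing standard meaning
KIBI = 2 ** 10   # binary usage in computing: 1024, formally the IEC kibi- (KiB)

size_bytes = 65_536
print(size_bytes / KILO)  # 65.536 "kilobytes" in the SI sense
print(size_bytes / KIBI)  # 64.0   "kilobytes" as computing usually means it
print(KIBI - KILO)        # 24 -- the drift the comment is pointing at
```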