UsoSaito@feddit.uk 2 days ago
It doesn’t confuse us… it annoys us with blatantly wrong information, e.g. that glue is a pizza ingredient.
Electricd@lemmybefree.net 2 days ago
That’s what you get when you use 3-year-old models
AngryRobot@lemmy.world 2 days ago
Are you trying to make us believe that AI doesn’t hallucinate?
oascany@lemmy.world 2 days ago
It doesn’t; it generates incorrect information. That’s because AI doesn’t think or dream. It’s a generative technology that outputs information based on whatever went in. It can’t hallucinate because it can’t think or feel.
Blue_Morpho@lemmy.world 2 days ago
Hallucinate is the word that has been assigned to what you described. When you don’t attach additional emotional baggage to the word, hallucinate is a reasonable word to pick to describe when an LLM follows a chain of words that have internal correlation but no basis in external reality.
Electricd@lemmybefree.net 2 days ago
No, but I was specifically talking about the glue and pizza example