Comment on OpenAI admits AI hallucinations are mathematically inevitable, not just engineering flaws
BombOmOm@lemmy.world 6 days ago
A hallucination is something that disagrees with your active inputs (ears, eyes, etc.). AIs don’t have these active inputs; all they have is the human equivalent of memories. Everything they produce is a hallucination, literally all of it. It’s simply coincidence when the hallucination matches reality.
Is it really surprising that a thing which can only create hallucinations is often wrong? That it will continue to be wrong on a regular basis?
mindbleach@sh.itjust.works 5 days ago
My guy, Microsoft Encarta 97 doesn’t have senses either, and its recollection of the capital of Austria is neither coincidence nor hallucination.