Comment on A courts reporter wrote about a few trials. Then an AI decided he was actually the culprit.
rsuri@lemmy.world 1 month ago
“Hallucinations” is the wrong word. To the LLM there’s no difference between reality and “hallucinations”, because it has no concept of reality or of what’s true and false. All it knows is what word should probably come next. The “hallucination” only exists in the mind of the reader. The LLM did exactly what it was supposed to.
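To put that point another way, a minimal sketch (with made-up, hypothetical probabilities) of what next-token prediction actually does: it samples from a learned distribution, with no mechanism anywhere that checks a candidate continuation against reality.

```python
import random

# Toy stand-in for a language model: for a given context, it only knows
# a probability distribution over possible next tokens. The probabilities
# below are invented for illustration, not from any real model.
next_token_probs = {
    "the reporter": {"covered": 0.6, "committed": 0.4},
}

def sample_next(context):
    """Sample the next token from the learned distribution for `context`."""
    probs = next_token_probs[context]
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights)[0]

# Both continuations are equally "real" to the model; whether the reporter
# "covered" the crime or "committed" it is a matter of statistics here,
# not of fact-checking.
print(sample_next("the reporter"))
```

Either output is a correct run of the program, which is exactly the problem: truthfulness is never part of the objective.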
Hobo@lemmy.world 1 month ago
They’re bugs. Major ones. Fundamental flaws in the program. People with a vested interest in “AI” rebranded them as hallucinations in order to downplay the fact that they have a major bug in their software and they have no fucking clue how to fix it.