Comment on How Much Do LLMs Hallucinate in Document Q&A Scenarios? A 172-Billion-Token Study Across Temperatures, Context Lengths, and Hardware Platforms [TLDR: 25%]
Son_of_Macha@lemmy.cafe 1 week ago
We need to stop calling it hallucinations and say what it is: ERRORS
cmhe@lemmy.world 1 week ago
Hallucinations of LLMs are just one class of errors, and the most dangerous one.
Other issues, like garbled or repeating output, are other classes of errors.