Comment on How Much Do LLMs Hallucinate in Document Q&A Scenarios? A 172-Billion-Token Study Across Temperatures, Context Lengths, and Hardware Platforms [TLDR: 25%]
Son_of_Macha@lemmy.cafe 3 weeks ago
We need to stop calling it hallucinations and say what it is: ERRORS
cmhe@lemmy.world 3 weeks ago
Hallucinations are just one class of LLM errors, and the most dangerous one.
Other failures, like garbled or repeating output, are different classes of errors.