Comment on How Much Do LLMs Hallucinate in Document Q&A Scenarios? A 172-Billion-Token Study Across Temperatures, Context Lengths, and Hardware Platforms [TLDR: 25%]

snooggums@piefed.world ⁨5⁩ ⁨hours⁩ ago

Aka being wrong, but with a fancy name!

When Cletus is wrong because he mixed up a dog and a cat when describing their behavior, do we call it hallucinating? No.
