Comment on AI Translations Are Adding ‘Hallucinations’ to Wikipedia Articles

XLE@piefed.social 6 hours ago

Maybe the technical term is "bullshit," because it returns something meant to appease the user regardless of truth value.

But "lie" is a less inaccurate term than "hallucinate," because a "hallucination" implies perceiving something that isn't there, when in fact the underlying data is equally present for outputs deemed non-hallucinations.
