Comment on AI Translations Are Adding ‘Hallucinations’ to Wikipedia Articles
XLE@piefed.social 6 hours ago
Maybe the technical term is “bullshit,” because it returns something meant to appease the user regardless of truth value.
But “lie” is definitely a less inaccurate term than “hallucinate,” because a “hallucination” implies generating something that isn’t there, even though the data is equally present for outputs deemed non-hallucinations.
GreenBeard@lemmy.ca 5 hours ago
I would argue that “hallucinate” doesn’t go nearly far enough, given that the model will double down and defend them. I would call them delusions.