Comment on AI Translations Are Adding ‘Hallucinations’ to Wikipedia Articles
Ulrich@feddit.org 9 hours ago
“Lie” implies intent. Do you have evidence of intent?
XLE@piefed.social 8 hours ago
Maybe the technical term is “bullshit,” because the model returns something meant to appease the user regardless of its truth value.
But “lie” is definitely a less inaccurate term than “hallucinate,” because “hallucination” implies perceiving something that isn’t there, when in fact the underlying data is equally present for outputs deemed non-hallucinations.
GreenBeard@lemmy.ca 7 hours ago
I would argue that “hallucinate” doesn’t go nearly far enough, given that the model will double down and defend its false claims. I would call them delusions.