Comment on AI Translations Are Adding ‘Hallucinations’ to Wikipedia Articles
webp@mander.xyz 13 hours ago
“AI translations are adding lies to Wikipedia articles” Fixed.
Ulrich@feddit.org 11 hours ago
“Lie” implies intent. Do you have evidence of intent?
XLE@piefed.social 10 hours ago
Maybe the technical term is “bullshit,” because it returns something meant to appease the user regardless of truth value.
But “lie” is definitely a less inaccurate term than “hallucinate,” because a “hallucination” implies perceiving something that isn’t there, when in fact the underlying data is equally present for outputs deemed non-hallucinations.
GreenBeard@lemmy.ca 9 hours ago
I would argue that “hallucinate” doesn’t go nearly far enough, given that it will double down and defend them. I would call them delusions.
XLE@piefed.social 12 hours ago
I’d like to believe 404 Media’s use of scare quotes there is intentional, but yes, 100%.