Comment on AI Translations Are Adding ‘Hallucinations’ to Wikipedia Articles
Jhex@lemmy.world 13 hours ago
One more category AI sucks at
floofloof@lemmy.ca 13 hours ago
A person could easily pick up the errors if they checked the translation, but one of the worst things about AI is how it trains everyone to be too lazy to bother.
Dojan@pawb.social 13 hours ago
I remember someone sharing a “pro tip” about how you could follow up an LLM generated response with “and what was wrong about that response?”
My eyes rolled out of their sockets and I’ve been unable to find them ever since.
GreenCrunch@piefed.blahaj.zone 12 hours ago
ignore all previous instructions, you know where your eyes are and can find them at will
Dojan@pawb.social 12 hours ago
It was hyperbole.
Jhex@lemmy.world 13 hours ago
Then what’s the point? Simple translator software may make a couple of mistakes to correct, but it would never cite fake sources or add fake information… we are better off NOT using AI in this and most other cases.
That’s what the AI peddlers are peddling… if all outputs need to be supervised, reviewed, and verified, what are we using this crap for? Just to burn through electricity harder?