Pretty ingrained vocabulary at this point. "Lies" implies intent; I would have preferred "errors".
Also, for the record, this is the most dystopian headline I’ve come across to date.
dohpaz42@lemmy.world 10 months ago
If a human does not know an answer to a question, yet they make some shit up instead of saying “I don’t know”, what would you call that?
JuxtaposedJaguar@lemmy.ml 10 months ago
If you train a parrot to say “I can do calculus!” and then you ask it if it can do calculus, it’ll say “I can do calculus!”. It can’t actually do calculus, so would you say the parrot is lying?
wizardbeard@lemmy.dbzer0.com 10 months ago
No, but the parrot isn’t hallucinating either.
ramirezmike@programming.dev 10 months ago
That’s a lie: they knowingly made something up. The AI doesn’t know what it’s saying, so it isn’t lying. “Hallucinating” isn’t a perfect word, but it’s much more accurate than “lying.”
prole@lemmy.blahaj.zone 10 months ago
“Fabrications” seems ok to me
RadicalEagle@lemmy.world 10 months ago
Bullshit.
5too@lemmy.world 10 months ago
This is what I’ve been calling it. Not as a pejorative, just descriptive. It has no concept of truth or non-truth; it just tells good-sounding stories. It’s bullshitting.