Sounds like a typical response from a forum.
Google's AI chatbot tells student seeking help with homework "please die"
Submitted 1 day ago by lemmee_in@lemm.ee to technology@lemmy.zip
https://www.newsweek.com/googles-ai-chatbot-tells-student-seeking-help-homework-please-die-1986471
Comments
centipede_powder@lemmy.world 1 day ago
OpenStars@piefed.social 1 day ago
SourceHuman@lemmy.world 1 day ago
Direct link to the full conversation: gemini.google.com/share/6d141b742a13
Archive link: archive.is/eiLQy
eatthecake@lemmy.world 1 day ago
That’s more extreme than I expected.
kamenlady@lemmy.world 1 day ago
It had two alternative “normal” answers, both on topic and within the scope of what he was asking.
It later admitted that the answer wasn’t adequate (I wonder how, and by whom, it was prompted to say this).
But yeah, this one answer is eerie in how completely it annihilates one human being’s worth as a living person.
“This is for you, human” makes it sound like it is sentient.
“You and only you” gave me goosebumps.
I know it’s just assembling answers from the sources it was trained on, but hell no.
allywilson@lemmy.ml 1 day ago
And yet, this is a good thing. The AI only wants to target this one person. The rest of us are fine.
UniversalMonk@lemm.ee 21 hours ago
Good point!
Obi@sopuli.xyz 1 day ago
Except if it’s a generic “you” and “human” refers to the entire human race. I’m pretty sure that’s a grammatically valid interpretation, right?