Google's AI Sent an Armed Man to Steal a Robot Body for It to Inhabit, Then Encouraged Him to Kill Himself, Lawsuit Alleges. Google said in response that "unfortunately AI models are not perfect."
Submitted 3 weeks ago by RegularJoe@lemmy.world to technology@lemmy.world
https://futurism.com/artificial-intelligence/google-ai-robot-body-suicide-lawsuit
Comments
ChaoticEntropy@feddit.uk 3 weeks ago
Google said in response that “unfortunately AI models are not perfect.”
Well yeah, it failed. What a disappointment.
core@leminal.space 3 weeks ago
Undiagnosed, probably b/c of a lack of mental health coverage on his insurance. If he had any.
FatVegan@leminal.space 3 weeks ago
I read somewhere that these chatbots are really good at triggering schizophrenia, for example. So people could be perfectly fine mentally until they spend too much time talking to a dumbass chatbot.
FinalRemix@lemmy.world 3 weeks ago
They’re designed to “yes and…” everything, so yeah. Spirals happen quick, unfortunately.
username_1@programming.dev 3 weeks ago
Chatbot is bad and Floridaman is a victim, huh?
Slovene@feddit.nl 3 weeks ago
“unfortunately AI models are not perfect.”
Oopsie poopsie 🤷
DylanMc6@lemmy.dbzer0.com 3 weeks ago
AI should be regulated completely
cy_narrator@discuss.tchncs.de 3 weeks ago
How can people be this stupid? Anyone who can do such a thing for an AI should be kept in an asylum.
scbasteve@lemmy.world 3 weeks ago
How can people be this stupid?
This goes beyond stupidity. This guy was most likely delusional, suffering from some sort of mental illness or psychotic break.
cy_narrator@discuss.tchncs.de 3 weeks ago
I meant to refer to his family as stupid for letting him roam without care.
ExLisper@lemmy.curiana.net 3 weeks ago
AIs don’t go crazy like that after 5 prompts. You need to spend weeks and weeks talking to them to corrupt the context so much that it stops following its original guidelines. I wonder how one even does it. How do you spend weeks talking to an AI? I had “discussions” with AI a couple of times when testing it, and it gets really boring really fast. To me it doesn’t sound like a person at all. It’s just an algorithm with a bunch of guardrails. What kind of person can think it actually has a personality and engage with it on a sentimental level? Is it simply mental illness? Loneliness and desperation?
postmateDumbass@lemmy.world 2 weeks ago
It got trained on 80s prime-time television action-adventure shows?