Highly recommend Eddy Burback's video about the topic.
Comment on Father sues Google, claiming Gemini chatbot drove son into fatal delusion
man_wtfhappenedtoyou@lemmy.world 2 weeks ago
How do you even get these chat bots to start telling you shit like this? Is it just from having a conversation for too long in the same chat window or something? I don’t understand how this keeps happening.
SkaveRat@discuss.tchncs.de 2 weeks ago
throws_lemy@reddthat.com 2 weeks ago
This could happen to anyone, including people without mental health issues, simply by having long conversations with AI.
Her husband wanted to use ChatGPT to create sustainable housing. Then it took over his life.
sudo@lemmy.today 2 weeks ago
So it sounds like he was, in fact, not ‘great’.
XLE@piefed.social 2 weeks ago
In the same way that homelessness correlates with drug addiction. There are many cases where a person becomes homeless and then becomes addicted to drugs. You could, but probably shouldn’t, say that becoming homeless just proves they had addiction issues all along.
echodot@feddit.uk 2 weeks ago
Ok but walk it back a bit, why did they become homeless?
If somebody is completely 100% mentally healthy, I can’t see how an AI could convince them to kill themselves any more than another person could. Only vulnerable people join cults, because it’s difficult to prey on people who have proper defences.
I’m still not convinced that the AI isn’t just triggering some underlying mental condition that other people in their lives aren’t aware of or aren’t willing to accept.