Comment on Father sues Google, claiming Gemini chatbot drove son into fatal delusion
Bassman27@lemmy.world 14 hours ago
So someone who already has an underlying mental health condition, diagnosed or not, is at fault for their own death even if they were coerced into it?
XLE@piefed.social 14 hours ago
Google, of all companies, probably has a better psychological profile of their users than the average doctor. They even offer a public-facing option to disable ads about gambling, alcohol, or pregnancy.
TwilitSky@lemmy.world 12 hours ago
TBH, alcohol ads are INSUFFERABLE, but who needs pregnancy ads blocked?
dtaylor84@lemmy.dbzer0.com 3 hours ago
Maybe those trying and failing to conceive?
XLE@piefed.social 11 hours ago
People who don’t want their family getting suspicious, perhaps. The Target Incident comes to mind.
Of course, disabling these options doesn’t mean Google stops knowing about mental or physical issues. I’m sure you know the best way to prevent that is to just avoid Google and ads altogether. This is probably just Google’s way of looking less creepy to the average person.
I_Has_A_Hat@lemmy.world 12 hours ago
In 1980, John Lennon was shot by a mentally ill man who was convinced to kill Lennon by reading Catcher in the Rye. If he had never read Catcher in the Rye, he most likely wouldn’t have killed John Lennon.
But it is not the fault of Catcher in the Rye. We don’t ban the book, or call the author irresponsible for writing it, because we recognize that the fault lies in the mental illness of the shooter, and that anything could have set him off.
The people who kill themselves because an AI chatbot told them to are mentally ill. It is their mental illness that killed them, not the chatbot. You can make the claim that if it weren’t for the chatbot, they wouldn’t have gone through with it, but again, you can say the same thing about Catcher in the Rye. Getting rid of the trigger does not remove the mental illness.
ToTheGraveMyLove@sh.itjust.works 11 hours ago
That’s a terrible argument. We don’t blame the book because Catcher in the Rye didn’t have a conversation with him and tell him to kill John Lennon. That’s the difference.
TwilitSky@lemmy.world 8 hours ago
“We don’t blame the book because Catcher in the Rye didn’t have a conversation with him and tell him to kill John Lennon. That’s the difference.”
Speak for yourself, please.
ToTheGraveMyLove@sh.itjust.works 8 hours ago
Oh, you’re a dumbass, huh?
NewNewAugustEast@lemmy.zip 9 hours ago
AIs can’t have conversations any more than a book can.
dtaylor84@lemmy.dbzer0.com 3 hours ago
How is that pedantic point relevant?
SaveTheTuaHawk@lemmy.ca 10 hours ago
Berkowitz was told by his neighbor’s dog to kill people.
TwilitSky@lemmy.world 8 hours ago
Yeah, but was that just a lie?
SaveTheTuaHawk@lemmy.ca 10 hours ago
“If he had never read Catcher in the Rye, he most likely wouldn’t have killed John Lennon.”
Sue Seagram’s!
ieGod@lemmy.zip 14 hours ago
It’s not the car manufacturer’s responsibility to guarantee a drunk driver doesn’t plow into others.
Vulnerable people don’t get to outsource responsibility.
Bassman27@lemmy.world 12 hours ago
Here’s the thing: there are no safeguards on who can and cannot use AI. There are safeguards to prevent deaths from drunk driving.
Drunk driving is illegal. It still happens, but the law is a deterrent to stop people from driving while intoxicated. I guarantee that if drunk driving were legal, there would be exponentially more deaths.
AI is being shoved down everyone’s throats on a day-to-day basis. There are no safeguards; even kids can use it.
Vulnerable people are victims of big tech for profit.
Your argument is poor.
SalamenceFury@piefed.social 14 hours ago
Here’s the thing: it’s usually normies with no history of mental illness who fall into this kind of stuff. Most of my friends and the people I follow on social media who are neurodivergent did experiment with chatbots, saw a fuckton of red flags in the way they work, and alerted everyone about it, if they didn’t already hate them for essentially stealing artistic output (in my case, it was both).