Huh?
Comment on Father sues Google, claiming Gemini chatbot drove son into fatal delusion
mojofrododojo@lemmy.world 4 weeks ago
Yeah, what was he wearing, right?
NewNewAugustEast@lemmy.zip 4 weeks ago
mojofrododojo@lemmy.world 4 weeks ago
How do we know this didn’t start off with prompts about creating a book, or asking about exciting things in life, or I don’t know what.
you’re blaming the victim. stop. why simp for one of the largest companies in the world?
jfc
NewNewAugustEast@lemmy.zip 4 weeks ago
Oh so stupid shit. Figures.
Yes, I am interested in how this happened. In a murder, do you not investigate it?
What the fuck.
Google can go fuck themselves.
mojofrododojo@lemmy.world 4 weeks ago
Oh so stupid shit. Figures.
ah, so incel shit. victim blaming, classic. if google can go fuck themselves, why are you blaming the user?
XLE@piefed.social 4 weeks ago
Pretty well articulated point.
“What did the prompts say” is a synonym for “was he asking for it”
mojofrododojo@lemmy.world 4 weeks ago
I’m so sick of people blaming mentally ill (or completely sane!) individuals for being goaded into psychosis by this shitware chatbot garbage masquerading as AI.
it’s fucking software. it shouldn’t be ABLE to talk someone into suicide, much less give them a countdown (literally what gemini did here). It shouldn’t be able to goad someone into attempting to attack an airport. I can’t fathom the liability if it had succeeded. I know goog has deep pockets, but fuck, this needs to stop.