Holy fuck ChatGPT killed that kid!
Comment on Teen killed himself after ‘months of encouragement from ChatGPT’, lawsuit claims
W3dd1e@lemmy.zip 3 weeks ago
ChatGPT told him how to tie the noose and even gave a load-bearing analysis of the noose setup. It offered to write the suicide note. Here’s a link to the lawsuit.
JoeBigelow@lemmy.ca 3 weeks ago
jpeps@lemmy.world 3 weeks ago
Oof yeah okay. If another human being had given this advice it would absolutely be a criminal act in most countries. I’m honestly shocked at how personable it tries to be.
lmagitem@lemmy.zip 3 weeks ago
Oh my God, this is crazy… “Thanks for being real with me”, “hide it from others”. It even gives the kid better reasons to kill himself than the ones he articulated himself, and helps him tie a better knot.