Comment on OpenAI says dead teen violated TOS when he used ChatGPT to plan suicide - Ars Technica
DaddleDew@lemmy.world 2 days ago
I’d say it’s more akin to a bread company saying that dying of food poisoning after eating their bread is a violation of their terms of service.
vurr@lemmy.today 1 day ago
That would imply that he wasn’t suicidal before. If ChatGPT didn’t exist, he would just use Google.
DaddleDew@lemmy.world 1 day ago
Look up the phenomenon called “Chatbot Psychosis”. A chatbot can absolutely mess with someone’s head enough to push them to the act, far beyond merely answering the question of how to do it.
Buffalox@lemmy.world 2 days ago
Yes, you’re right, it’s hard to find an analogy that is both as stupid and still sounds somewhat plausible.
Of course a bread company cannot reasonably claim that eating their bread is against their terms of service. But that’s exactly the problem: it’s the same for OpenAI. They cannot reasonably claim what they are claiming.