If the gun also talked to you
Comment on OpenAI says dead teen violated TOS when he used ChatGPT to plan suicide - Ars Technica
Buffalox@lemmy.world 3 weeks ago
That’s like a gun company claiming using their weapons for robbery is a violation of terms of service.
mriormro@lemmy.zip 3 weeks ago
Capricorn_Geriatric@lemmy.world 3 weeks ago
Talked you into it*
CosmoNova@lemmy.world 2 weeks ago
That’s a company claiming companies can’t take responsibility because they are companies and can’t do wrong. They use this kind of defense virtually every time they get criticized. AI ruined the app for you? Sorry, but that’s progress. We can’t afford to lag behind. Oh, you can’t afford rent and are about to become homeless? Sorry, but we are legally required to make our shareholders happy. Oh, your son died? He should’ve read the TOS. Can’t afford your meds? Sorry, but number must go up.
Companies are legally required to be incompatible with human society long term.
Whostosay@sh.itjust.works 3 weeks ago
Yeah this metaphor isn’t even almost there
jazzkoalapaws@ttrpg.network 2 weeks ago
That analogy is 100% accurate.
It is exactly like that.
gian@lemmy.grys.it 3 weeks ago
I would say that it is more like a software company putting in their TOS that you cannot use their software to do a specific thing(s).
Would it be correct to sue the software company because a user violated the TOS? I agree that what happened is tragic and that the answer by OpenAI is beyond stupid, but in the end they are suing the owner of a technology for a user’s misuse of said technology. Or should we also sue Wikipedia because someone looked up how to hang himself?
That’s like a gun company claiming using their weapons for robbery is a violation of terms of service.
The gun company can rightfully say that what you do with your property is not their problem.
But let’s make a less controversial example: do you think you can sue a fishing rod company because I use one of their rods to whip you?
Pieisawesome@lemmy.dbzer0.com 2 weeks ago
In my legal headcanon, this boils down to whether OpenAI flagged him and did nothing.
If they flagged him, then they knew about the ToS violations and did nothing, and they should be in trouble.
If they didn’t know, but can demonstrate that they would take action in this situation, then, in my opinion, they are legally in the clear…
HeyThisIsntTheYMCA@lemmy.world 2 weeks ago
depends whether intent is a required factor for the state’s wrongful death statute (my state says it’s not, as wrongful death is there for criminal homicides that don’t fit the murder statute). if openai acted intentionally, recklessly, or negligently in this they’re at least partially liable. if they flagged him, it seems either intentional or reckless to me. if they didn’t, it’s negligent.
however, if the deceased used some kind of prompt injection (i don’t know the right terms, this isn’t my field) to bypass gpt’s ethical restrictions, and if understanding how to bypass gpt’s ethical restrictions is in fact esoteric, only then would i find openai was not at least negligent.
as i myself have gotten gpt to do something it’s restricted from doing, and i haven’t worked in IT since the 90s, i’m led to a single conclusion.
DaddleDew@lemmy.world 3 weeks ago
I’d say it’s more akin to a bread company saying that dying of food poisoning after eating their bread is a violation of terms of service.
Buffalox@lemmy.world 3 weeks ago
Yes, you are right, it’s hard to find an analogy that is both as stupid and still sounds somewhat plausible.
Because of course a bread company cannot reasonably claim that eating their bread is against terms of service. But that’s exactly the problem, because it’s the exact same for OpenAI, they cannot reasonably claim what they are claiming.
vurr@lemmy.today 3 weeks ago
That would imply that he wasn’t suicidal before. If ChatGPT didn’t exist, he would have just used Google.
DaddleDew@lemmy.world 3 weeks ago
Look up the phenomenon called “Chatbot Psychosis”. The chatbot can absolutely mess up someone’s head enough to push them to the act far beyond just answering the question of how to do it.