Comment on Teen killed himself after ‘months of encouragement from ChatGPT’, lawsuit claims
chrischryse@lemmy.world 2 days ago
OpenAI shouldn’t be responsible. The kid was probing ChatGPT with specifics. It’s like poking someone who repeatedly told you to stop, and then your family getting mad at that person for beating you up.
So I don’t feel bad. Plus, people are using this as their own therapist; if you aren’t gonna get actual help and want to rely on a bot, then good luck.
OpenAI knowingly allowing its service to be used as a therapist most certainly makes them liable. They are toying with people’s lives with an untested and unproven product.
This kid was poking no one, and he didn’t get his ass beat. He is dead.
That’s like saying “WebMD is knowingly acting like everyone’s doctor.” ChatGPT is a tool, and you need to remember it’s a bot that doesn’t understand a lot or show emotion.
The kid was also telling ChatGPT “oh, this hanging is for a character,” along with other ways to trick it. Sure, I guess OpenAI should be slightly responsible, but not as responsible as people are for how they use it. If you’re not going to bother with real help, I’m not showing sympathy. I get that suicide sucks, but what sucks more is putting your loved ones through that tragedy.
If a company designs a flawed tool that harms people, it is responsible. Why are you trying so hard to absolve them of responsibility?
The last part about suicide is pretty tone-deaf. I have lost multiple people in my life to suicide.
themachinestops@lemmy.dbzer0.com 2 days ago
The problem here is that the kid, if I’m not wrong, asked ChatGPT whether he should talk to his family about his feelings. ChatGPT said no, which in my opinion makes it at fault.