Comment on OpenAI Says It's Scanning Users' ChatGPT Conversations and Reporting Content to the Police

OsrsNeedsF2P@lemmy.ml 2 weeks ago
Nah, not for suicide:

But in the post warning users that the company will call the authorities if they seem like they're going to hurt someone, OpenAI also acknowledged that it is "currently not referring self-harm cases to law enforcement to respect people's privacy given the uniquely private nature of ChatGPT interactions."

Soup@lemmy.world 2 weeks ago
Consider how the US handles those cases; that may actually be a broken-clock good thing. If they sent the cops to a suicidal person's house, said cops would probably kill them themselves.
turtlesareneat@discuss.online 2 weeks ago
Oh thank god, I was afraid some more kids might get talked into suicide by a fucking server
synae@lemmy.sdf.org 2 weeks ago
Oh, so only for discussing topics the authorities consider verboten