Comment on OpenAI says over a million people talk to ChatGPT about suicide weekly
whiwake@sh.itjust.works 1 day ago
Still, what are they gonna do to a million suicidal people besides ignore them entirely?
Scolding7300@lemmy.world 1 day ago
Advertise drugs to them perhaps, or some other sort of taking advantage, if this sort of data is in the hands of an ad network.
whiwake@sh.itjust.works 1 day ago
It’s never the drugs I want though :(
snooggums@piefed.world 23 hours ago
No, no. They want repeat customers!
Scolding7300@lemmy.world 15 hours ago
Unless they sell lifetime deals. Probably cheap on the warranty/support side if the drug doesn’t work 🤔
Bougie_Birdie@piefed.blahaj.zone 1 day ago
My pet theory: Radicalize the disenfranchised to incite domestic terrorism and further OpenAI’s political goals.
whiwake@sh.itjust.works 21 hours ago
What are their political goals?
Bougie_Birdie@piefed.blahaj.zone 20 hours ago
Tax breaks for tech bros
whiwake@sh.itjust.works 19 hours ago
I think total control over the country might be the goal, and it’s a bit more than a tax break.
wewbull@feddit.uk 15 hours ago
Strap explosives to their chests and send them to their competitors?
turdcollector69@lemmy.world 11 hours ago
Convince each one that they alone are the chosen one to assassinate Grok, and that this mission is all that matters to give their life meaning.
whiwake@sh.itjust.works 14 hours ago
Take that Grok!!
WhatAmLemmy@lemmy.world 1 day ago
Well, AI therapy is more likely to harm their mental health, even to the point of encouraging suicide (as certain cases have already shown).
FosterMolasses@leminal.space 1 day ago
There’s evidence that a lot of suicide hotlines can be just as bad. You hear awful stories all the time of overwhelmed or fed-up operators taking it out on the caller. There’s some real evil people out there. And not everyone has access to a dedicated therapist who wants to help.
whiwake@sh.itjust.works 1 day ago
Real therapy isn’t always better. At least there you can get drugs. But neither are a guarantee to make life better—and for a lot of them, life isn’t going to get better anyway.
kami@lemmy.dbzer0.com 21 hours ago
Are you comparing a professional to a text generator?
whiwake@sh.itjust.works 21 hours ago
Have you ever had ineffective professional therapy?
CatsPajamas@lemmy.dbzer0.com 1 day ago
Real therapy is definitely better than an AI. That said, AIs will never encourage self-harm without significant gaming.
whiwake@sh.itjust.works 21 hours ago
AI “therapy” can be very effective without the gaming, but the problem is most people want it to tell them what they want to hear. Real therapy is not “fun” because a therapist will challenge you on your bullshit and not let you shape the conversation.
I find it does a pretty good job with pro and con lists, listing out several options, and taking situations and reframing them. I have found it very useful, but I have learned not to manipulate it or its advice just becomes me convincing myself of a thing.
triptrapper@lemmy.world 22 hours ago
I agree, and to the comment above you, it’s not because it’s guaranteed to reduce symptoms. There are many ways that talking with another person is good for us.
Cybersteel@lemmy.world 1 day ago
Suicide is big business. There’s infrastructure readily available to reap financial rewards from the activity, at least in the US.
atmorous@lemmy.world 1 day ago
More so from the corporate proprietary ones, no? At least I hope those are the only cases. The open-source ones suggest really useful approaches that proprietary ones do not. Now, I don’t rely on open-source AI, but they are definitely better.
SSUPII@sopuli.xyz 1 day ago
The corporate models are actually much better at this due to having heavy filtering built in. The claim that a model generally encourages self-harm is just a lie, which you can disprove right now by pretending to be suicidal on ChatGPT. You will see it adamantly push you to seek help.
Still, the filters and safety nets can be bypassed no matter how hard you make them, and that is the reason we got some unfortunate news.