OpenAI Is Having a Mental Health Crisis
Submitted 1 month ago by TheBat@lemmy.world to technology@lemmy.world
https://gizmodo.com/openai-is-having-a-mental-health-crisis-2000690751
Comments
frustrated_phagocytosis@fedia.io 1 month ago
This can only be improved by their upcoming introduction of ads. Imagine it not only giving advice on committing suicide, but recommending sponsored guns, pills, or other tools on behalf of their advertisers!
thejml@sh.itjust.works 1 month ago
I mean, we did train it with data from the internet and books and history and everything else we could throw at it… This is like Leeloo in The Fifth Element learning all of the language and discovering “War”. If it really was AGI, there’s no way you could be forced to consume all of that and come away “fine”.
architect@thelemmy.club 1 month ago
Google has access to everyone’s email, so for that alone OpenAI is screwed.
technocrit@lemmy.dbzer0.com 1 month ago
If it really was AGI,
Even worse when it’s a glorified auto-complete.
Azzu@lemmy.dbzer0.com 1 month ago
Of course the mental health team is bleeding talent. It probably (initially) consisted of people who actually care about mental health, and they gradually figured out that no matter what they do or try, the technology they work for can only ever be a net negative on mental health. I would also wash my hands of it as fast as possible and go back to actually contributing to positive mental health.
floquant@lemmy.dbzer0.com 1 month ago
Yes, it started when they gutted the non-profit oversight and charter. The illness is called capitalism.
SuiXi3D@fedia.io 1 month ago
Having, had, what's the difference?
LostWanderer@fedia.io 1 month ago
Who would've thought?! Given how they designed their artificially incompetent creations to be complaisant bundles of algorithms built to maximize engagement from vulnerable users. "AI" validates anything it is told and doesn't actually get users real human assistance when they have a mental health crisis. These tools can be easily prompted into divulging suicide methods, and they deliberately isolate vulnerable people in order to maintain engagement. Until we regulate the fuck out of companies like OpenAI and the research+development process of "AI", this will be a problem that more people will experience.
Telorand@reddthat.com 1 month ago
Borrowing that: AI = Artificial Incompetence
mr_account@lemmy.world 1 month ago
That’s the one part that isn’t artificial though