Comment on Men are opening up about mental health to AI instead of humans
poopkins@lemmy.world 3 weeks ago
Funny, I was just reading comments in another thread about people with mental health problems proclaiming how terrific AI therapy is. Especially concerning is how they’d found value in the recommendations LLMs make and were “trying those out.” One of the commenters described themselves as “neuro diverse” and was acting on “advice” from generated LLM responses.
And for something like depression, this is deeply bad advice. I feel somewhat qualified to weigh in on it as somebody who has struggled severely with depression and managed to get through it with the support of a very capable therapist. There’s a tremendous amount of depth and context to somebody’s mental condition, and understanding it takes deliberate probing, not stringing together words until they form sentences that mimic human interaction.
Let’s not forget that an LLM will not be able to raise alarm bells, read medical records, write prescriptions, or work with other medical professionals. Another thing people often forget is that LLMs have a maximum context length (a token limit) and cannot, on their own, keep a detailed “memory” of everything that’s been discussed.
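To make that context-window point concrete, here is a minimal sketch, assuming a hypothetical 4096-token budget and a toy whitespace “tokenizer” (real services count tokens differently and are not modeled on any particular vendor’s API), of how older conversation turns simply stop being shown to the model once a chat outgrows its maximum context length.

```python
# Minimal sketch of why an LLM chat "forgets": the model only ever sees
# what fits in its context window. The 4096-token budget and the
# whitespace tokenizer below are assumptions for illustration only.

MAX_CONTEXT_TOKENS = 4096  # hypothetical context-window size


def count_tokens(text: str) -> int:
    """Crude stand-in for a real tokenizer; good enough to show the idea."""
    return len(text.split())


def build_prompt(history: list[dict], new_message: str) -> list[dict]:
    """Keep only the most recent turns that fit in the context window.

    Anything older than the cutoff is simply never shown to the model,
    so earlier disclosures are effectively forgotten on later calls.
    """
    messages = history + [{"role": "user", "content": new_message}]
    kept: list[dict] = []
    budget = MAX_CONTEXT_TOKENS
    for msg in reversed(messages):      # walk backwards from the newest turn
        cost = count_tokens(msg["content"])
        if cost > budget:
            break                       # older turns no longer fit: dropped
        kept.append(msg)
        budget -= cost
    return list(reversed(kept))         # restore chronological order
```

Chat products typically bolt on summaries or retrieval to soften this, but on any single call the model still only sees whatever fits in that window.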
It’s effectively self-treatment with more steps.
It’s effectively self-treatment with more steps.
And for many people it’s better than nothing and likely the best they can do. Waiting lists for a basic therapist in my area are months long.
I get it, but I’m not sure that “something is better than nothing” holds in this case. I don’t judge any individual for using it, but the risks are huge, as others have documented. And the benefits are questionable.
Something is always better than nothing, especially if you are starving.
LLM will not be able to raise alarm bells
Not raising alarm bells is kind of the “benefit” LLM therapy would provide, if it worked. The reality is that it doesn’t, but it serves as a proof of concept that there is a need for anonymous therapy. Therapy in the USA is only for people with socially acceptable illnesses. People rightfully live in fear of getting labeled as untreatable or a danger to self and others, and then at best being dropped from therapy and at worst institutionalized.
yep, almost nobody wants to be committed to a psych ward without consent
I can’t find the story for the life of me right now, but I’m pretty sure there was one a few months back where someone was talking with an LLM about their depression and suicide, and the LLM essentially said, “yeah, you should probably do it,” because to the LLM, that was the best solution to the problem.
Chulk@lemmy.ml 3 weeks ago
Also worth noting that:
1. AI is arguably a surveillance technology, built on decades of our data and behavioral patterns.
2. The US government is increasingly authoritarian and has expressed interest in throwing neurodivergent people into labor camps.
3. Large AI companies like OpenAI are signing contracts with the Department of Defense.
If I were a US citizen, I would avoid discussing my personal life with AI like the plague.