Thank you for the more detailed rundown. I would set it against two other things, though. One, that for someone who is suicidal or similar, and can’t face or doesn’t know how to find a person to talk to, those beginning interactions of generic therapy advice might (I imagine; I’m not speaking from experience here) do better than nothing.
From that, secondly, a more general point about AI. Where I’ve tried it, it’s good with things people have already written lots about. E.g. a programming feature where people have already asked the question a hundred different ways on Stack Overflow. Not so good with new things: it’ll make up what its training data lacks. The human condition is as old as humans. Sure, there are some new and refined approaches, and values and worldviews change over the generations, but old good advice is still good advice. I can imagine that in certain ways therapy is an area where AI would be unexpectedly good…
…Notwithstanding your point, which I think is quite right. And as the conversation goes on the risk gets higher and higher. I, too, worry about how people might get hurt.
Sektor@lemmy.world 3 weeks ago
Does gen AI tell you you are worthless, you are ugly, you are the reason your parents divorced, you should kill yourself, you should doomscroll social media?
Krauerking@lemy.lol 3 weeks ago
Probably not, but I bet if you said it was your grandma’s birthday you could get it to say most of that.
And hey, sometimes it’s the objective truth that it doesn’t know.
Personally, I know I am the reason my parents are divorced. My incubator nicknamed me the “divorce baby” until she could come up with other, worse names, but I wear it with pride. They were miserable POSs together, and now at least my dad is doing better and my incubator has to spend a lot more effort scamming people.
Truths are what they are. But don’t fall for the lies that your brain or a robot chatbot tells you.