Comment on People Are Losing Loved Ones to AI-Fueled Spiritual Fantasies
sugar_in_your_tea@sh.itjust.works 2 days ago
the problem with ChatGPT in particular is that it validates the psychosis… that is very bad.
So do astrology and conspiracy theory groups on forums and other forms of social media, the main difference is whether you’re getting that validation from humans or a machine. To me, that’s a pretty unhelpful distinction, and we attack both problems the same way: early detection and treatment.
lenz@lemmy.ml 2 days ago
I think having that kind of validation at your fingertips, whenever you want it, is worse. At least people, even people deep in the clutches of a conspiracy, can disagree with each other. At least they know what they are saying. The AI always says what the user wants and expects to hear. Though I can see how that distinction may matter little to some, I just think ChatGPT has qualities that make it worse than what a forum could do.
sugar_in_your_tea@sh.itjust.works 2 days ago
Sure. But on the flip side, you can ask it the opposite question (tell me the issues with <belief>) and it’ll do that as well, and you’re not going to get that from a conspiracy theory forum.
qarbone@lemmy.world 1 day ago
I don’t have personal experience with people suffering psychoses, but I would think that, if you have the wherewithal to ask questions about opposing beliefs, you’d be noticeably less likely to get suckered into scams and conspiracies.
sugar_in_your_tea@sh.itjust.works 1 day ago
Sure, but at least the option is there.