You’re not wrong, but isn’t that also how Better Help works?
Comment on What could go wrong?
ininewcrow@lemmy.ca 1 month ago
A human therapist might not, or at least is far less likely to, share personal details from your conversations with anyone.
An AI therapist will collect, collate, catalog, and store every single personal detail about you, hand it to the company that owns the AI, and sell all of your data to the highest bidder.
Crewman@sopuli.xyz 1 month ago
exu@feditown.com 1 month ago
And you know how Better Help is very scummy?
Saledovil@sh.itjust.works 5 weeks ago
If a company pays for YouTube sponsorships, it’s likely a scam.
essell@lemmy.world 1 month ago
Better help is the Amazon of the Therapy world.
desktop_user@lemmy.blahaj.zone 5 weeks ago
the AI therapist probably can’t force you into a psych ward, though; a human psychologist is obligated to (under the right conditions).
Krauerking@lemy.lol 5 weeks ago
Who says that’s not coming in the next paid service based on this great idea for chatbots to provide therapy to the abused masses?
desktop_user@lemmy.blahaj.zone 5 weeks ago
Nobody, but locally run models will continue to be an option (unless the government fucks up the laws).
WR5@lemmy.world 5 weeks ago
I’m not advocating for it, but it could just be locally run and therefore unable to share anything?
bitjunkie@lemmy.world 5 weeks ago
The data isn’t useful if the person no longer exists.
DaddleDew@lemmy.world 1 month ago
Nor would a human therapist be inclined to find the perfect way to use all this information to manipulate people while they are at their weakest.
They are also pushing for the idea of an AI “social circle” for increasingly socially isolated people through which world view and opinions can be bent to whatever whoever controls the AI desires.
What we are currently witnessing is a push for a human mind-manipulation and control experiment that will make the Cambridge Analytica scandal look like a fun joke.
fullsquare@awful.systems 1 month ago
PSA that Nadella, Musk, saltman (and a handful of other techfash) own dials that can bias their chatbots any way they please. If you use chatbots for writing anything, they control how racist your output will be.