Yeah… But in order to make bubble hash you need a shitload of weed trimmings. It’s not like you’re just gonna watch a YouTube video, then a few hours later have a bunch of drugs you created… Unless you already had the drugs in the first place.
Also, Google search results and YouTube videos aren’t personalized for every user, and they don’t try to pretend that they’re a person having a conversation with you.
morto@piefed.social 7 months ago
Yes, it is. People are personifying LLMs and having emotional relationships with them, which leads to unprecedented forms of abuse. Searching for shit on Google or YouTube is one thing, but being told to do something by an entity you have an emotional link to is much worse.
TheMonk@lemmings.world 7 months ago
I don’t remember reading about sudden shocking numbers of people getting “Google-induced psychosis.”
ChatGPT and similar chatbots are very good at imitating conversation. Think of how easy it is to suspend reality online—pretend the fanfic you’re reading is canon, stuff like that. When those bots are mimicking emotional responses, it’s very easy to get tricked, especially for mentally vulnerable people. As a rule, the mentally vulnerable should not habitually “suspend reality.”
TheReanuKeeves@lemmy.world 7 months ago
I think we need a built-in safety for people who actually develop an emotional relationship with AI, because that’s not a healthy sign.
postmateDumbass@lemmy.world 7 months ago
Good thing capitalism has provided you an AI chatbot psychiatrist to help you not depend on AI for mental and emotional health.