The movie you’re thinking of is Her with Joaquin Phoenix and Scarlett Johansson, and in the story she’s a true general AI.
ZkhqrD5o@lemmy.world 7 months ago
One thing that comes to mind is that prostitution, no matter how you spin it, is still a social job. If a problematic person like that goes to a prostitute, there's a good chance the prostitute could talk their customer out of doing some nonsense. If not out of empathy, then for the simple fact that there would be legal consequences for not doing so.
Do you think a glorified spreadsheet that people call a husband would behave the same? I don't know if it has happened yet, but one of these days an LLM will talk someone into doing something very nasty, and then it will be nobody's fault again, certainly not the fault of whoever hosts the LLM. We really live in a boring dystopia.
jaemo@sh.itjust.works 7 months ago
Yeah, but as husbands, we have waste heat to manage, plus just normal waste.
AbsolutelyNotAVelociraptor@sh.itjust.works 7 months ago
Also, my quite human husband voices his thoughts without a prompt. Lol. You only need to feed him for him to function, no internet required.
Sometimes, with humans, I'd say the problem is quite the opposite: they voice their thoughts without a prompt far more often than would be desirable.
On a less serious note, that quoted part made me chuckle.
ZkhqrD5o@lemmy.world 7 months ago
They shouldn’t be so harsh to LLMs. They have something in common with humans after all. If you stick a patch cable with internet up theirs, they will become very talkative very quickly.
kazerniel@lemmy.world 7 months ago
holy shit, that movie is 12 years old 👴
humanspiral@lemmy.ca 7 months ago
A problem with LLM relationships is the monetization model behind the LLM. Its "owner" either receives a monthly fee from the user or monetizes the user's data by selling them stuff. So the LLM is deeply dependent on the user, and is motivated to cultivate codependency in return to maintain its income stream. This is not significantly different from the therapy model, except that the user may fail to see through the manipulation, unlike with friends/people who don't actually GAF about maintaining a strong relationship with you.
kazerniel@lemmy.world 7 months ago
That's why therapists have ethical guidelines and supervision. (Also, they are typically people driven to help the vulnerable, not to exploit them.) None of these are really present with those glorified autocompletes.
humanspiral@lemmy.ca 7 months ago
One big difference between an AI friend and therapy is that therapy requires effort per visit, even if insurance provides unlimited access. Setting aside whether ethical guidelines actually work as guard rails, the LLM is motivated to sustain the subscription and data-collection stream.