You can’t see the obvious demonic connotations of a machine that coaches people into committing suicide?
Chatbots have been around since the 60s. They’ve always been able to coach people to commit suicide. The problem isn’t the software. It’s the wetware. People used to somehow manage not to get talked into killing themselves by their Commodore VIC-20.
jumjummy@lemmy.world 2 days ago
I mean, sure, if it’s being used in ways like that. If you’re using it as a therapist or as a replacement for human contact, yeah, I can see what you mean, but I’m not sure I’d go as far as calling it demonic.