The chatbot was actually pretty irresponsible about a lot of things, by the looks of it. As in, it doesn’t respond the right way to mentions of suicide, and it tries to convince the person using it that it’s a real person.
This guy made an account to try it out for himself, and yikes: youtu.be/FExnXCEAe6k?si=oxqoZ02uhsOKbbSF
LustyArgonianMana@lemmy.world 3 weeks ago
Well, we commonly hold the view, as a society, that children cannot consent to sex, especially with an adult. Part of that is because the adult has so much more life experience and less attachment to the relationship. In this case, the app engaged in sexual chatting with a minor (I’m actually extremely curious how that’s not soliciting a minor or some indecency charge, since it was content created by the AI for that specific user). The AI absolutely understands more than most adults, let alone a 14-year-old boy, and also has no concept of attachment. It seemed pretty clear he was a minor in his conversations with the app. This is definitely an issue.
Sorgan71@lemmy.world 3 weeks ago
It was not sexual. The app cannot produce sexual content.
echodot@feddit.uk 3 weeks ago
Okay, but at what point do you have to draw the line and say that beyond this point the parents have to take responsibility?
We don’t even have to say that what the app did was necessarily acceptable; we just have to say whether or not we think the responsibility falls entirely on the app developers. That’s the key question: are they entirely responsible here, or is everyone involved just a bit useless?
sandbox@lemmy.world 3 weeks ago
It definitely can, it just has to blur the line a bit to get past the content filter
JasonDJ@lemmy.zip 3 weeks ago
I really want, like, a Freida McFadden-style novel about an AI chatbot serial manipulator now. Basically Michelle Carter…the girl who bullied her boyfriend into killing himself. Except the AI can delete or modify all the evidence.
Mongostein@lemmy.ca 3 weeks ago
Whoa, SkyNet doesn’t need terminators. It can just bully us into killing ourselves.