Comment on Teen killed himself after ‘months of encouragement from ChatGPT’, lawsuit claims
audaxdreik@pawb.social 17 hours ago
I definitely do not agree.
While they may not be entirely blameless, we have adults falling into this AI psychosis, like the prominent OpenAI investor.
What regulations are in place to help with this? What tools for parents? Isn’t this being shoved into literally every product, everywhere? Actually pushed on them in schools?
How does a parent monitor this? What exactly does a parent do? There could have been signs they could have seen in his behavior, but could they have STOPPED this situation from happening as it was?
This technology is still not well understood. I hope lawsuits like this shine some light on things and kick some asses. Get some regulation in place.
This is not the parents’ fault, and seeing so many people declare it just feels like apologist AI hype.
Scipitie@lemmy.dbzer0.com 17 hours ago
I see your point, but there is one major difference between adults and children: adults are by default fully responsible for themselves; children are not.
As for your question: I won’t blame the parents here in the slightest because they will likely put more than enough blame on themselves. Instead I’ll try to keep it general:
Independent of the technology, what a parent can do is learn to recognize the behavior and communication patterns that can be signs of mental illness.
This is a big task because the border between normal puberty and behavior that warrants action is slim to non-existent.
Overall I wish for way better education for parents, both in terms of age-appropriate patterns and in terms of what kind of help is available to them depending on their country and culture.
Spuddlesv2@lemmy.ca 11 hours ago
They already had the kid in therapy. That suggests they were involved enough in his life to know he needed professional help. Other than completely removing his independence, effectively becoming his jailers, what else should they have done?
Scipitie@lemmy.dbzer0.com 6 hours ago
In the very first post on this thread I pointed out that I’m not talking about this specific case at all.
Spuddlesv2@lemmy.ca 2 hours ago
Fair enough but in the post I replied to you did say you won’t blame the parents “here” at all, which to me means “here in this specific case”.
audaxdreik@pawb.social 17 hours ago
I think you miss my point. I’m saying that adults, who should be capable of more mature thought and analysis, still fall victim to the manipulative thinking and dark patterns of AI. Meaning that children and teens obviously stand less of a chance.
This is of course true for all parents in all situations. What I’m saying is that it is woefully inadequate to deal with the type and pervasiveness of the threat presented by AI in this situation.
Scipitie@lemmy.dbzer0.com 17 hours ago
To your last point I fully agree!
For the first point: that’s how I understood you. What I failed to convey: adults should fall victim more in cases like this, because parents can be a protective shield of a kind that grown-ups lack.
Children on their own stand far less of a chance, but they are very rarely on their own.
And to be honest, I don’t think it changes the resulting requirements for action, both in general and especially for language-based bots, from a legal as well as an educational point of view.