Comment on A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It
Midvikudagur@lemmy.world 2 days ago
“Child pornography” is a term NGOs and law enforcement are trying to get phased out. It makes it sound like CSAM is related to porn, when in fact it is simply abuse of a minor.
TipsyMcGee@lemmy.dbzer0.com 2 days ago
Still a bit confusing to me. In my jurisdiction, child pornography is the legal term, and it applies independently of whether anyone is abused or has had their sexual integrity violated (e.g. a 17-year-old above the age of consent who voluntarily is depicted in a sexual context and voluntarily shares the content with peers, or content depicting a child that doesn’t exist).