Think of it as nonconsensual AI-generated CSAM. Like nonconsensual AI porn, you take public SFW photos and use the AI to generate explicit images, with the provided photo serving as the reference for the depicted victim.
As for the overshare library: think of all the pictures of people's kids you see in your social media feed, then consider how few people apply proper privacy controls to what they post (or just intentionally post publicly for laughs/attention). All of those images can be used as the basis for generating nonconsensual AI porn/CSAM.
3dcadmin@lemmy.relayeasy.com 1 day ago
The images do not need to start out as sexually abusive material. Attackers find easily scraped images and then use AI to make them sexually abusive or pornographic, which they then use to blackmail the child/children and their parents. For instance, in a case two weeks back, a young mother had shared images of her kids, nothing sexual, but a few were in swimsuits when they were young. These were doctored and sent to the girl, who is now older, to blackmail her, and then on to her mother in an attempt to blackmail her as well. The pictures were shared with a lack of understanding of privacy at the time, so anyone could see them.

The police are struggling to find out who is behind the blackmail, and also struggling to find grounds to actually investigate, as they say it is a likeness of the person rather than the actual person, and the original was shared to the world years ago, meaning permission was given (i.e. the picture was allowed to be shared at the time). Of course I am not revealing any identifying details, as it is a current police investigation and that would be illegal, but it appears to be going nowhere, and it remains deeply disturbing to the kid and the mother.