AI-generated content is like plastic pollution
Comment on "Instagram Is Full Of Openly Available AI-Generated Child Abuse Content"
General_Effort@lemmy.world 5 weeks ago
When I saw this, two questions came to mind: How come this isn’t immediately reported? And why would anyone upload illegal material to a platform that tracks its users as thoroughly as Meta’s does?
The answer is:
All of those accounts followed the same visual pattern: blonde characters with voluptuous bodies and ample breasts, blue eyes, and childlike faces.
The one question that came to mind upon reading this is: What?
nutsack@lemmy.dbzer0.com 5 weeks ago
mic_check_one_two@lemmy.dbzer0.com 5 weeks ago
My guess is that the algorithm is really good at predicting who is likely to follow that kind of content rather than report it. Basically, it flies under the radar purely because the only people who see it are the ones who have a vested interest in it flying under the radar.
General_Effort@lemmy.world 5 weeks ago
Look again. The explanation is that these images simply don’t look like any kind of CSAM. The whole story looks like some sort of scam to me.
WeirdGoesPro@lemmy.dbzer0.com 5 weeks ago
I’m a little confused as to how it can still be AI CSAM if the bodies are voluptuous and the breasts are ample. Childlike faces have been the bread and butter of face filters for years.
Which parts specifically have to be childlike for it to be AI CSAM? This is why we need some laws ASAP.
sik0fewl@lemmy.ca 5 weeks ago
Things that you want to understand but sure as fuck ain’t gonna Google.