Comments on: Instagram Is Full of Openly Available AI-Generated Child Abuse Content
mic_check_one_two@lemmy.dbzer0.com 2 days ago
My guess is that the algorithm is really good at predicting who is likely to follow that kind of content rather than report it. Basically, it flies under the radar purely because the only people who see it are the ones with a vested interest in it flying under the radar.
General_Effort@lemmy.world 1 day ago
Look again. The explanation is that these images simply don’t look like any kind of CSAM. The whole story looks like some sort of scam to me.