There are communities on NSFW Lemmy where people intentionally present as children engaged in sexual abuse scenarios. They’re adults, and it’s their kink that everyone is supposed to tolerate and pretend is okay.
See the defederation drama over the last couple of days. What I’m saying is, the hashtags mean nothing.
Aesthesiaphilia@kbin.social 1 year ago
We do know they only found, what, 112 actual images of CP? That's a very small number. I'd say that paints us in a pretty good light, relatively.
dustyData@lemmy.world 1 year ago
112 images out of 325,000 scanned over two days is about 0.03%, so we are doing pretty well. With more moderation tools we could continue to knock out those stigmas.
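To spell out the arithmetic behind that percentage (just the two numbers from the comment above):

```python
# 112 flagged items out of 325,000 images scanned over the two-day window
flagged = 112
scanned = 325_000

rate = flagged / scanned
print(f"{rate:.4%}")  # 0.0345%, i.e. roughly the 0.03% quoted above
```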
blazera@kbin.social 1 year ago
It says 112 instances of known CSAM. But that's based on their methodology, right, and their methodology isn't actually looking at the content; it's looking at hashtags and whether Google SafeSearch thinks it's explicit, which I'm pretty sure doesn't differentiate by what the subject of the explicitness is. It's just gonna try to detect breasts or genitals, I imagine (rough sketch of what those checks do and don't cover below).
Though they do give a few damning examples of things like actual CP trading, but they also note those have been removed.
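For context on what that kind of check can and can't tell you, here's a minimal sketch (not the report's actual pipeline; the hash list and helper names are made up for illustration) of the two mechanisms being described: matching uploads against a list of hashes of already-known CSAM, and asking a SafeSearch-style classifier whether an image looks explicit. The classifier only returns likelihoods for categories like "adult" or "racy"; it says nothing about the age of the person depicted, which is the commenter's point.

```python
import hashlib

# Hypothetical set of hashes of previously catalogued CSAM, standing in for the
# vetted hash lists platforms get from matching services. Real systems (PhotoDNA
# and the like) use perceptual hashes that survive resizing/re-encoding; plain
# SHA-256 here is only to illustrate the idea of "known content" matching.
KNOWN_CSAM_HASHES = set()


def matches_known_csam(image_bytes: bytes) -> bool:
    """True only if this exact item was already catalogued somewhere.

    This kind of check cannot recognize *new* abuse material, and it says
    nothing about what an unmatched image actually depicts.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_CSAM_HASHES


def looks_explicit(image_bytes: bytes) -> bool:
    """Ask a SafeSearch-style classifier whether an image is explicit.

    Uses Google Cloud Vision's SafeSearch detection as an example. It returns
    likelihoods for categories like "adult" and "racy", but nothing about the
    subject's age -- so on its own it can't distinguish legal adult content
    from CSAM.
    """
    from google.cloud import vision  # requires google-cloud-vision + credentials

    client = vision.ImageAnnotatorClient()
    annotation = client.safe_search_detection(
        image=vision.Image(content=image_bytes)
    ).safe_search_annotation
    return annotation.adult >= vision.Likelihood.LIKELY
```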
Rivalarrival@lemmy.today 1 year ago
How many of those 112 instances are honeypots controlled by the FBI or another law enforcement agency?