It can hide in plain sight, and then when you dig into someone's profile, it can lead to someone, or a group, discussing CSAM and bestiality, not just CP. Like on a site similar to r/pics, or a porn site.
It definitely seems weird how easy it is to stumble upon CP online, and how open people are about sharing it, with no effort made, in many instances, to hide what they’re doing. I’ve often wondered how much of the stuff is spread by pedo rings and how much is shared by cops trying to see how many people they can catch with it.
Ledericas@lemm.ee 3 weeks ago
swelter_spark@reddthat.com 3 weeks ago
I can definitely see how people could find it while looking for porn. I don’t understand how people can do this stuff out in the open with no consequences.
Ledericas@lemm.ee 3 weeks ago
Yeah, it's often hidden too well to be easily found out, and authorities might want to gather evidence, so they let it accumulate and then pounce. One of the sites was mostly innuendo and talk about committing it, but not actually distributing the material; they co-opt certain images and pervert them.
Cryophilia@lemmy.world 3 weeks ago
If you have stumbled on CP online in the last 10 years, you’re either really unlucky or trawling some dark waters. This ain’t 2006. The internet has largely been cleaned up.
swelter_spark@reddthat.com 3 weeks ago
I don’t know about that.
I spot most of it while looking for out-of-print books about growing orchids on the typical file-sharing networks. The term “blue orchid” seems to be frequently used in file names of things that are in no way related to gardening. The eMule network is especially bad.
When I was looking into messaging clients a couple years ago, to figure out what I wanted to use, I checked out a public user directory for the Tox messaging network and it was maybe 90% people openly trying to find, or offering, custom made CP. On the open internet, not an onion page or anything.
Then maybe last year, I joined openSUSE’s official Matrix channels, and some random person (who, to be clear, did not seem connected to the distro) invited me to join a room called openSUSE Child Porn, with a room logo that appeared to be an actual photo of a small girl being violated by a grown man.
I hope to god these are all cops, because I have no idea how there can be so many pedos just openly doing their thing without being caught.
Cryophilia@lemmy.world 3 weeks ago
I would consider all of these to be trawling dark waters.
Schadrach@lemmy.sdf.org 3 weeks ago
…and most of the people who agree with that notion would also consider reading Lemmy to be “trawling dark waters” because it’s not a major site run by a massive corporation actively working to maintain advertiser friendliness to maximize profits. Hell, Matrix is practically Lemmy-adjacent in terms of the tech.
swelter_spark@reddthat.com 3 weeks ago
File-sharing and online chat seem like basic internet activities to me.
veeloth@lemm.ee 3 weeks ago
Not stumbled upon it, but I've met a couple of people offering it on mostly normal Discord servers.
Ledericas@lemm.ee 3 weeks ago
Most definitely not clean, lmao, you're just not actively searching for it or stumbling onto it.
Cryophilia@lemmy.world 3 weeks ago
That’s…what I said.
LustyArgonianMana@lemmy.world 3 weeks ago
Search “AI woman porn miniskirt” and tell me you don’t see questionable results in the first two pages: women who at least appear possibly younger than 18.
Fuck, the head guy of Reddit, u/spez, was the main mod of r/jailbait before he changed the design of Reddit so he could hide mod names. Also, look into the u/MaxwellHill / Ghislaine Maxwell conspiracy on Reddit.
Cryophilia@lemmy.world 3 weeks ago
Did it with SafeSearch off and got a bunch of women clearly in their late teens or 20s. Plus, I don’t want to derail my main point, but I think we should acknowledge the difference between a picture of a real child actively being harmed vs. a 100% fake image. I didn’t find any AI CP, but even if I had, it’s in an entirely different universe of morally bad.
That was, what, fifteen years ago? It’s why I said “in the last 10 years.”
LustyArgonianMana@lemmy.world 3 weeks ago
“Clearly in their late teens,” sure.
Obviously there’s a difference with AI porn vs. real, that’s why I told you to search AI in the first place??? The convo isn’t about AI porn, but AI porn uses real images, including CSAM, to seed its new images.