Honestly, if the existing victims have to deal with a few more people masturbating to the existing video material, and in exchange it leads to fewer future victims, it might be worth the trade-off, but it is certainly not an easy choice to make.
Geetnerd@lemmy.world 2 months ago
ZILtoid1991@lemmy.world 2 months ago
The issue is, AI is often trained on images of real children, sometimes even real CSAM (allegedly), which makes the “no real children were harmed” part not necessarily 100% true.
Also, since AI can generate photorealistic imagery, it muddies the water for the real thing.
Ledericas@lemm.ee 2 months ago
That is still CP, and distributing CP still harms children. Eventually they want to move on to the real thing, as porn is not satisfying them anymore.
Schadrach@lemmy.sdf.org 2 months ago
Eventually they want to move on to the real thing, as porn is not satisfying them anymore.
Isn’t this basically the same argument as arguing violent media creates killers?
quack@lemmy.zip 2 months ago
I understand you’re not advocating for it, but I do take issue with the idea that AI CSAM will prevent children from being harmed. While it might satisfy some of them (at first, until the high from that wears off and they need progressively harder stuff), a lot of pedophiles are just straight up sadistic fucks, and a real child being hurt is what gets them off. I think it’ll just make the “real” stuff even more valuable in their eyes.
Corno@lemm.ee 2 months ago
I feel the same way. I’ve seen the argument that it’s analogous to violence in videogames, but it’s pretty disingenuous since people typically play videogames to have fun and for escapism, whereas with CSAM the person seeking it out is doing so in bad faith. A more apt comparison would be people who go out of their way to hurt animals.
Schadrach@lemmy.sdf.org 1 month ago
A more apt comparison would be people who go out of their way to hurt animals.
Is it? That person is going out of their way to do actual violence. Arguing that someone watching a slasher movie is more likely to go commit murder feels like a much closer analogy to arguing that someone watching a cartoon of a child engaged in sexual activity or whatever is more likely to molest a real kid.
misteloct@lemmy.world 2 months ago
Somehow I doubt allowing it actually meaningfully helps the situation. It sounds like an alcoholic arguing that a glass of wine actually helps them not drink.
danny161@discuss.tchncs.de 2 months ago
That would be acceptable, but that’s unfortunately not what’s happening. Like I said: since the majority of the files are hosted on file-sharing platforms on the normal web, it’s far more effective to have those platforms delete them. Some investigative journalists tried this and documented the frustration among users of child porn websites at having to re-upload all the GBs/TBs of material; many of them quit and shut down their sites rather than going the extra mile.
yetAnotherUser@discuss.tchncs.de 2 months ago
It doesn’t though.
The most effective way to shut these forums down is to register bot accounts scraping links to the clearnet direct-download sites hosting the material and then reporting every single one.
If everything posted to these forums were deleted within a couple of days, their popularity would falter. And victims much prefer having their footage deleted to letting it stay up for years in order to catch a handful of site admins.
Frankly, I couldn’t care less about punishing the people hosting these sites. It’s an endless game of cat and mouse and will never be fast enough to meaningfully slow down the spread of CSAM.
taladar@sh.itjust.works 2 months ago
Who said anything about punishing the people hosting the sites? I was talking about punishing the people uploading and producing the content, the ones doing the part that is orders of magnitude worse than anything else about this.
yetAnotherUser@discuss.tchncs.de 2 months ago
I’d be surprised if many “producers” are caught. From what I have heard, most uploads on those sites are reuploads because it’s orders of magnitude easier.
Of the 1400 people caught, I’d say maybe 10 were site administrators and the rest passive “consumers” who didn’t use Tor. I wouldn’t get my hopes up too much that anyone who was caught ever committed child abuse themselves.
I mean, 1400 identified out of 1.8 million really isn’t a whole lot to begin with.
taladar@sh.itjust.works 2 months ago
If most are reuploads anyway, that kills the whole argument that deleting things works, though.