Comment on Instagram Is Full Of Openly Available AI-Generated Child Abuse Content.
FauxLiving@lemmy.world 11 months ago
Child Sexual Abuse Material is abhorrent because children were literally abused to create it.
AI generated content, though disgusting, is not even remotely on the same level.
The moral panic around AI that leads to implying that these things are the same thing is absurd.
Go after the people filming themselves literally gang raping toddlers, not the people typing forbidden words into an image generator.
Don’t dilute the horror of the production of CSAM by equating it to fake pictures.
suicidaleggroll@lemm.ee 11 months ago
Yes, at a cursory glance that’s true. AI-generated images don’t involve the abuse of children, and that’s great. The problem is the follow-on effects. What’s to stop actual child abusers from just photoshopping a 6th finger onto their images and then claiming that it’s AI generated? AI image generation is getting absurdly good now, nearly indistinguishable from actual pictures. By the end of the year I suspect it will be truly indistinguishable. When that happens, how do you tell which images are AI generated and which are real? How do you know who is peddling real CP and who isn’t if AI-generated CP is legal?
mic_check_one_two@lemmy.dbzer0.com 11 months ago
What’s to stop actual child abusers from just photoshopping a 6th finger onto their images and then claiming that it’s AI generated?
Aside from the other arguments people have presented, this wrecks one of the largest reasons that people produce CSAM. Pedophiles are insular data hoarders by necessity, because actually creating and procuring it is such a big risk. Every time they go online to find new content, they’re at risk of stumbling into a honeypot. And producing it requires IRL work, and a LOT of risk of being caught/turned in by the victim. They tend to form tight-knit rings, and one of the only reliable ways to get into a ring as an outsider is to provide your own CSAM to the others. CSAM is traded in these rings like baseball cards, where you need fresh content in order to receive fresh content.
The data hoarding side of things is where all of the “cops bust pedophile with 100TB of CSAM” headlines come from. In reality, it was probably more like 1TB of videos (which is a lot, but not unheard of) that was backed up multiple times in multiple places, because losing it would be catastrophic for the CSAM producer; they can’t simply go grab a new Blu-ray of it. And the cops counted the full size of each backup disk, not just the space that was used.
Intentionally marking your content as AI-generated would ruin the trading value, because nobody will see it as valuable/worth trading for if it’s fake. At best, you won’t get anything for it. At worst, you’d be labeled a cop trying to pass off AI content to gather evidence.
FauxLiving@lemmy.world 11 months ago
What’s the follow-on effect of making generated images illegal?
Do you want your freedom to be at stake where the questions before the jury are “How old is this image of a person (who doesn’t exist)?” and “Is this fake person TOO child-like?”
When that happens, how do you tell which images are AI generated and which are real? How do you know who is peddling real CP and who isn’t if AI-generated CP is legal?
You won’t be able to tell; we can assume that this is a given.
So the real question is:
Who are you trying to arrest and put in jail and how are you going to write that difference into law so that innocent people are not harmed by the justice system?
To me, the evil people are the ones harming actual children. Trying to blur the line between them and people who generate images is a morally confused position.
There’s a clear distinction between the two groups and that distinction is that one group is harming people.
ExLisper@lemmy.curiana.net 11 months ago
If pedophiles won’t be able to tell what’s real and what’s AI generated, why risk jail to create the real ones?
Grimy@lemmy.world 11 months ago
Although that’s true, such material can easily be used to groom children, which is where I think the real danger lies.
I really wish they had excluded children in the datasets.
You can’t really put a stop to it anymore, but I don’t think it should be normalized and accepted just because there isn’t a direct victim anymore.
Cryophilia@lemmy.world 11 months ago
This literally makes no sense.
DNS@discuss.online 11 months ago
Found the guy who watches said content. I hope you never plan on having kids.
Grimy@lemmy.world 11 months ago
Kids will do things if they see other children doing it in pictures and videos. It’s easier to normalize sexual behavior with CP than without.
Cryophilia@lemmy.world 11 months ago
This sounds like you’re searching really hard for a reason to justify banning it. Pretty tenuous “what if” there.
Like, a dildo could hypothetically be used to sexualize a child. Should we ban dildos?
It’s so vague it could apply to anything.