AI-generated child sex imagery has every US attorney general calling for action: “A race against time to protect the children of our country from the dangers of AI.”
Isn’t AI-generated content better than content sourced from real life? It could actually drive a reduction in instances of child sexual abuse, with offenders leveraging an alternative source.
db2@sopuli.xyz 9 months ago
They’re not pictures of real people, proceeding against it on that basis undermines the point and makes them look like idiots. It should be banned on principle but ffs there’s got to be a way that doesn’t look reactionary and foolish.
ram@lemmy.ca 9 months ago
Except when they are pictures of real people doing a body swap
db2@sopuli.xyz 9 months ago
That isn’t at all what an AI generated image is. People have been doing that for more than 50 years.
BadRS@lemmy.world 9 months ago
That’s been possible since before Photoshop and certainly is possible after.
fidodo@lemm.ee 9 months ago
Shouldn’t that already be covered under revenge porn laws?
newthrowaway20@lemmy.world 9 months ago
But aren’t these models built from source material? I imagine if you want CP AI, you need actual CP to train it, no? That definitely would be a problem.
DLSchichtl@lemmy.world 9 months ago
Not necessarily. You could mix PG pics of kids with a Lora trained on pubescent looking but otherwise legal naughty bits. At least that would be the sane way to do it. But hey, world’s full of insane people.
Rivalarrival@lemmy.today 9 months ago
No, you can use a genetic algorithm. You have your audience rate a legal, acceptable work. You present the same work to an AI and ask it to manipulate traits, and provide a panel of works to your audience. Any derivative that the audience rates better than the original is then given back to the AI for more mutations.
Feed all your mutation and rating data to an AI, and it can begin to learn what the audience wants to see.
Have a bunch of pedophiles doing the training, and you end up with “BeyondCP”.
NegativeInf@lemmy.world 9 months ago
My question is where they got the training data for a sufficiently advanced CP image generator. Unless it’s just AI porn with kids’ faces? Which is still creepy, but I guess there are tons of pictures that people post of their own kids?
SinningStromgald@lemmy.world 9 months ago
Manga, manhwa, CG sets etc. of shota/loli. Sprinkle in some general child statistics for height, weight etc. And I’m sure social media helped as well, with people making accounts for their babies, for God’s sake.
Plenty of “data” I’m sure to train up an AI.
bobs_monkey@lemm.ee 9 months ago
Wouldn’t put it past some sick fucks to feed undesirable content into AI training.
cerberus@lemmy.world 9 months ago
It’s unacceptable in any form.
Kerfuffle@sh.itjust.works 9 months ago
It’s obviously very distasteful but those needs don’t just go away. If people with that inclination can’t satisfy their sexual urges at home just looking at porn, it seems more likely they’re going to go out into the world and try to find some other way to do it.
Also, controlling what people do at home that doesn’t affect anyone else, even in a case like this, isn’t likely to target exactly those people, and it’s very likely not to stop there either. I’d personally be very hesitant to ban/persecute stuff like that unless there was actual evidence that it was harmful and that the cure wasn’t going to be worse than the disease.
PilferJynx@lemmy.world 9 months ago
Humans have been raping kids since our inception. Childhood is a relatively modern concept that young adults are now a part of. It’s an ugly and pervasive subject that needs further study to reduce child harm.
db2@sopuli.xyz 9 months ago
Nobody here said anything otherwise.
Tibert@compuverse.uk 9 months ago
Whatever you wanted to say, I do not understand your point.
It’s not about directly protecting images of real people, it’s about preventing pedophilia.
Pedophilia IS AN ADDICTION!!! Fueling it with anything, even AI, will worsen the ADDICTION!
hh93@lemm.ee 9 months ago
No one chooses to be a pedophile - as far as we know it’s just the unluckiest possible sexual attraction.
Stigmatizing it won’t help anyone - those people need help, and anything that keeps real children from being hurt until they get that psychological help is good in my book.