Are you suggesting that this particular type of CP should be acceptable? (And suddenly “but I used AI” becomes a popular defence.)
Comment on GenAI website goes dark after explicit fakes exposed
jaschen@lemm.ee 2 months ago
Who actually gets hurt in AI generated cp? The servers?
blind3rdeye@lemm.ee 2 months ago
jaschen@lemm.ee 2 months ago
No CP should be acceptable. But I argue AI-generated content isn’t CP.
This is no different from someone cutting a child’s head out of a Target catalog, sticking it onto a body in a Playboy magazine, and masturbating to it.
Or someone Photoshopping a kid’s head onto a pornographic photo.
It’s just a more accessible version of those examples.
At the end of the day, what you do in your own home is your thing. It’s not my business what you do. As long as it doesn’t hurt/affect anyone, go ahead.
Ilovethebomb@lemm.ee 2 months ago
I almost respect you for taking a stance so blatantly against what most people believe.
Almost.
Goretantath@lemm.ee 2 months ago
Making a sexual photo of a child based on real photos is essentially using that child from the training data as the one in the act…
jaschen@lemm.ee 2 months ago
But who is actually getting hurt? No kid has been hurt by gen AI.
OutDoeHoe@lemmy.world 2 months ago
A child whose abuse images are used to generate AI CP can be re-victimized by it, without even getting into the issues of normalizing it.
jaschen@lemm.ee 2 months ago
Maybe. Nobody can prove it.
mbirth@lemmy.ml 2 months ago
I don’t remember whether it was some news article or a discussion thread. But other people also suggested this might help during therapy and/or rehab. And they made the same argument, that nobody gets harmed in creating these.
As for uses outside of controlled therapy, I’d be afraid it might make people want the “real thing” at some point. And, as others already pointed out: Good luck proving to your local police that those photos on your laptop are all “fake”.
barnaclebutt@lemmy.world 2 months ago
It fetishizes the subjects’ images, and nobody knows if it would lead to recidivism in child predators. It is generally accepted that producing drawings of CP alone is bad, let alone by AI. I remember some dude getting arrested at the Canadian border for sexual drawings of Bart and Lisa. Regardless, I would say that it is quite controversial and probably not what you’d want your company to be known for…
jaschen@lemm.ee 2 months ago
Japan has a vibrant drawn CP market, yet they are not even close to having the highest rate of child abuse. undispatch.com/here-is-how-every-country-ranks-on…
surewhynotlem@lemmy.world 2 months ago
All the little girls it learned from.
jaschen@lemm.ee 2 months ago
Gen AI doesn’t take CP content and recreate it. There wouldn’t be a point to gen AI if that were the case. It knows what regular porn looks like and what a child looks like, and it generates an image. With those inputs it can create something new and at the same time hurt nobody.
surewhynotlem@lemmy.world 2 months ago
Prove it. Please, show me the full training data to guarantee you’re right.
But also, all the kids used for “kids’ face data” didn’t sign up to be in porn.
jaschen@lemm.ee 2 months ago
I don’t need to. It’s just the way gen AI works. It takes images of things it knows and then generates NEW content based on what it thinks you want from your prompts.
If I’m looking for an infant flying an airplane, gen AI knows what a pilot looks like and what a child looks like, and it creates something new.
Also, “kids’ face data” doesn’t mean they take the actual face of an actual child and paste it on a body. It might take an eyebrow and a freckle from one kid, use a hairstyle from another, and eyes from someone else.
Lastly, the kids’ parents consented when they uploaded images of their kids to social media.
barnaclebutt@lemmy.world 2 months ago
[Image]
jaschen@lemm.ee 2 months ago
I’m no pedo, but what you do in your own home that hurts nobody is your own thing.
reseller_pledge609@lemmy.dbzer0.com 2 months ago
Yes, but how is the AI making the images or videos? It has to be trained on SOMETHING.
So, direct harm or not, harm is done at some point in the process, and it needs to be stopped before it slips and gets worse because people “get used to” it.
Mnemnosyne@sh.itjust.works 2 months ago
AI can combine two things. It can train on completely normal pictures of children, and it can train on completely normal porn, and then it can put those together.
This is the same reason it can do something like Godzilla with Sailor Moon’s hair, not because it trained on images of Godzilla with Sailor Moon’s hair, but because it can combine those two separate things.
Kusimulkku@lemm.ee 2 months ago
I wouldn’t think it needs to have child porn in the training data to be able to generate it. It has porn in the data, it knows what kids look like; merge the two. I think that works for anything AI knows about: make this resemble that.
MonkderVierte@lemmy.ml 2 months ago
Old cp?
Ah, right, almost forgot the killer-games rhetoric.