This genie is probably impossible to get back in the bottle.
People are just going to direct the imitative so-called AI program to make the face just different enough to have plausible deniability that it’s a fake of this person or that person. Or use existing tech to age them to 18+ (or 30+ or whatever). Or darken or lighten their skin, or change their eye or hair color. Or add tattoos or piercings or scars…
I’m not saying we should be happy about it, but it is here and I don’t think it’s going anywhere. Like, if you tell your so-called AI to give you a completely fictional nude image or animation of someone who looks similar to Taylor Swift but isn’t Taylor Swift, what’s the privacy (or other) violation, exactly?
Does Taylor Swift own every likeness that looks somewhat like hers?
otp@sh.itjust.works 6 months ago
The laws regarding a lot of this stuff seem to ignore that people under 18 can and will be sexual.
If we allow people to use this tech for adults (which we really shouldn’t), then we have to accept that people will use the same tech on minors. It isn’t even necessarily pedophilia in all cases (such as when the person making them is also a minor)*, but it’s still something that very obviously shouldn’t be happening.
Without checks in place, this technology will INEVITABLY be used to undress children. If the images are stored anywhere, then these companies will be storing/possessing child pornography.
The only way I can see to counteract this would be to invade the privacy of users (and victims) to the point where nobody using them """legitimately""" would want to use it…or to just ban them outright.
micka190@lemmy.world 6 months ago
I get the point you’re trying to make. But minors taking nudes of themselves is illegal in a lot of places, because it’s still possession.
BrianTheeBiscuiteer@lemmy.world 6 months ago
And that’s still a bit messed up. It’s a felony for a teen to have nude pictures of themselves, and they’ll be registered as sex offenders for life and probably ineligible for most professions. That seems like quite a gross overreaction. There needs to be a lot of reform in this area, but no politician wants to look like a “friend” to pedophiles.
Zorque@kbin.social 6 months ago
Which is more of a "zero-tolerance" policy, akin to giving the same punishment to a student defending themselves as to the person who initiated the attack.
otp@sh.itjust.works 6 months ago
I agree. On the one hand, I understand why it could be good to consider it illegal (to prevent child porn from existing), but it does also seem silly to treat it as a case of pedophilia.
vzq@lemmy.blahaj.zone 6 months ago
That’s a lot of words to defend fake child porn made out of photos and videos of actual children.
NOT_RICK@lemmy.world 6 months ago
Reading comprehension not a strong suit? Sounds to me like they’re arguing for protections for both adults AND minors.
Zorque@kbin.social 6 months ago
That's about the right amount of words to completely ignore the sentiment of a statement so you can make a vapid, holier-than-thou reply based on purported moral superiority.
Fosheze@lemmy.world 6 months ago
Have you tried actually reading what they said instead of just making shit up?
otp@sh.itjust.works 6 months ago
“If we allow people to use this tech for adults (which we really shouldn’t), then we have to accept that people will use the same tech on minors.”

Uh…this is the second sentence or so (and the start of the second paragraph, I think).
So I’m not sure where you got the idea that I’m defending AI-generated child porn.
Unless you’re so adamant about AI porn generators existing that banning their usage on adults (or invading the privacy of the users and victims with oversight) is outright unthinkable? Lol
I’m saying that IF the technology exists, people will be using it on pictures of children. We need to keep that in mind when we think about laws for this stuff. It’s not just adults uploading pictures of themselves (perfectly fine) or adult celebrities (not fine, but probably more common than any truly acceptable usage).
WallEx@feddit.de 6 months ago
What a dumb take. And I do those myself, so I know one when I see one.