Comment on Kids are making deepfakes of each other, and laws aren’t keeping up
BombOmOm@lemmy.world 3 days ago
Taking nude pictures of someone is quite a bit different than… not taking nude pictures of them.
It’s not CSAM to put a picture of someone’s face on an adult model.
atomicorange@lemmy.world 3 days ago
How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?
BombOmOm@lemmy.world 3 days ago
It’s clearly sexual harassment. However, you can’t just claim something contains underage nudity when the nudity is of an adult model.
atomicorange@lemmy.world 3 days ago
Yes, it’s sexual abuse of a child, the same way taking surreptitious locker room photos would be. There’s nothing magical about a photograph of real skin vs. a fake. The impact on the victim is the same. The impact on the viewer of the image is the same. Arguing over the semantic definition of “abuse” is getting people tangled up here. If we used the older term, “child porn,” people wouldn’t be so hesitant to call this what it is.