I think generating and sharing sexually explicit images of a person without their consent is abuse.
That’s distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I’m morally uncomfortable criminalizing an act that has no victim.
lka1988@lemmy.dbzer0.com 9 months ago
Except, you know, for the harassment and assault of the deepfaked individual (remember, assault doesn’t require physical contact to qualify). And it’s sexual in nature: sexual harassment and assault of a child, carried out using material generated from that child’s identity.
Maybe we could have a name for it. Something like child-based sexual harassment and assault material… CSHAM, or maybe just CSAM, you know, to make it easier to remember.