Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to the way women and girls are made to feel disgustingly dehumanized by every man or boy in their lives: groups of men and boys reducing them and their bodies to vivid sexual fantasies that they can quickly render as photorealistic images.
If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn't consented to this, then it should be classified as sexual exploitation.
Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation of that behavior. It should be illegal to do this, and it should be prosecuted when and where it is found to occur.
lath@lemmy.world 3 weeks ago
A school setting generally means it involves underage individuals, which makes any content using them CSAM. So in effect, the “AI” companies are generating a ton of CSAM and nobody is doing anything about it.
wewbull@feddit.uk 3 weeks ago
Disagree. Not CSAM when no abuse has taken place.
That’s my point.
Zak@lemmy.world 3 weeks ago
I think generating and sharing sexually explicit images of a person without their consent is abuse.
That’s distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I’m morally uncomfortable criminalizing an act that has no victim.
kemsat@lemmy.world 3 weeks ago
Harassment sure, but not abuse.
atomicorange@lemmy.world 3 weeks ago
If someone put a camera in the girls’ locker room and distributed photos from it, would you consider that CSAM? No contact would have taken place and the kids would be unaware they were being photographed; is it still abuse?
If so, how is the psychological effect of a convincing deepfake any different?
BombOmOm@lemmy.world 3 weeks ago
Taking nude pictures of someone is quite a bit different than…not taking nude pictures of them.
It’s not CSAM to put a picture of someone’s face on an adult model.
General_Effort@lemmy.world 3 weeks ago
If someone puts a camera in a locker room, that means that someone entered a space where you would usually feel safe. It implies the potential of a physical threat.
It also means that someone observed you when you were doing “secret” things. One may feel vulnerable in such situations. Even a seasoned nude model might be embarrassed to be seen while changing, maybe in a dishevelled state.
I would think it is very different. Unless you’re only thinking about the psychological effect on the viewer.
lath@lemmy.world 3 weeks ago
There’s a thing that was happening in the past. Not sure it’s still happening, due to the lack of news about it. It was something called “glamour modeling,” I think, or an extension of it.
Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions, and sold them to interested parties.
Nothing untoward directly happened to the children. They weren’t physically abused. They were treated as regular fashion models. And yet, it’s still CSAM. Why? Because of the intention behind making those pictures.
The intention to exploit.
lka1988@lemmy.dbzer0.com 3 weeks ago
Except, you know, the harassment and assault (remember, assault doesn’t need contact to qualify as such) of said deepfaked individual. Which is sexual in nature. Sexual harassment and assault of a child using materials generated based on the child’s identity.
Maybe we could have a name for it. Something like Child-based sexual harassment and assault material… CSHAM, or maybe just CSAM, you know, to remember it more easily.