Comment on Kids are making deepfakes of each other, and laws aren’t keeping up
lath@lemmy.world 1 week ago
A school setting generally means underage individuals are involved, which makes any content using them CSAM. So in effect, the “AI” companies are generating a ton of CSAM and nobody is doing anything about it.
wewbull@feddit.uk 1 week ago
Disagree. Not CSAM when no abuse has taken place.
That’s my point.
Zak@lemmy.world 1 week ago
I think generating and sharing sexually explicit images of a person without their consent is abuse.
That’s distinct from generating an image that looks like CSAM without the involvement of any real child. While I find that disturbing, I’m morally uncomfortable criminalizing an act that has no victim.
kemsat@lemmy.world 1 week ago
Harassment sure, but not abuse.
atomicorange@lemmy.world 1 week ago
If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place and the kids would be unaware they were being photographed, so is it still abuse?
If so, how is the psychological effect of a convincing deepfake any different?
BombOmOm@lemmy.world 1 week ago
Taking nude pictures of someone is quite a bit different than…not taking nude pictures of them.
It’s not CSAM to put a picture of someone’s face on an adult model.
atomicorange@lemmy.world 1 week ago
How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?
General_Effort@lemmy.world 1 week ago
If someone puts a camera in a locker room, that means that someone entered a space where you would usually feel safe. It implies the potential of a physical threat.
It also means that someone observed you when you were doing “secret” things. One may feel vulnerable in such situations. Even a seasoned nude model might be embarrassed to be seen while changing, maybe in a dishevelled state.
I would think it is very different. Unless you’re only thinking about the psychological effect on the viewer.
lath@lemmy.world 1 week ago
There’s a thing that was happening in the past. Not sure if it’s still happening, given the lack of news about it. It was something called “glamour modeling,” I think, or an extension of it.
Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions and sold them to interested parties.
Nothing untoward directly happened to the children. They weren’t physically abused. They were treated as regular fashion models. And yet, it’s still CSAM. Why? Because of the intention behind making those pictures.
The intention to exploit.
lka1988@lemmy.dbzer0.com 1 week ago
Except, you know, the harassment and assault (remember, assault doesn’t need contact to qualify as such) of said deepfaked individual. Which is sexual in nature. Sexual harassment and assault of a child using materials generated based on the child’s identity.
Maybe we could have a name for it. Something like Child-based sexual harassment and assault material… CSHAM, or maybe just CSAM, you know, to remember it more easily.