Comment on Grok AI still being used to digitally undress women and children despite suspension pledge
aesthelete@lemmy.world 4 days ago
Yeah, they probably fed it a bunch of legitimate on/off content, as well as stuff from people who used to make nudes from celebrity photos with sheer/skimpy outfits as a creepy hobby.
nullroot@lemmy.world 4 days ago
Also, CSAM in training data is definitely a thing.
Allero@lemmy.today 4 days ago
Honestly, I’d love to see more research on how AI CSAM consumption affects consumption of real CSAM and rates of sexual abuse.
Because if it does reduce them, it might make sense to intentionally use datasets already involved in previous police investigations as training data. But only if there’s a clear reduction.
(The police have already used some materials, with victims’ consent, to crack down on CSAM-sharing platforms in the past.)
NikkiDimes@lemmy.world 4 days ago
Or we could like…not
VoteNixon2016@lemmy.blahaj.zone 3 days ago
No.
Allero@lemmy.today 3 days ago
Why though? If it does reduce consumption of real CSAM (which is an “if”, as the stigma around the topic greatly hinders research), it’s a net win.
Or is it simply a matter of spite?
bookmeat@lemmynsfw.com 3 days ago
The idea is that generating CSAM requires training data, and harm was done to obtain that data. This is why it’s bad.
Allero@lemmy.today 3 days ago
That would be true if children were abused specifically to obtain the training data. But what I’m talking about is using data that already exists, taken from police investigations and other sources. Of course, it would also require the victims’ consent (once they’re old enough), as not everyone will agree to have materials of their abuse proliferate in any way.
The police have already used CSAM, with victims’ consent, to better impersonate CSAM platform admins in investigative operations, leading to the arrest of more child abusers and people sharing the materials.
The case with AI is milder, as it requires minimal human interaction, so no one would need to re-watch the materials as long as the victims are already identified. It would be enough for the police to contact the victims, get their agreement, and feed the data into the AI without releasing the source. With enough data, AI could improve image and video generation, driving more viewers away from real CSAM and reducing rates of abuse.
That is, if it works this way. There’s a glaring research gap in this area, and I believe it is paramount to figure out whether it helps. Then we could decide whether to include already-produced CSAM in the training data, or whether adult data alone is sufficient to make the output good enough for the intended audience to switch.