Comment on Grok AI still being used to digitally undress women and children despite suspension pledge
bookmeat@lemmynsfw.com 2 days ago
The idea is that to generate CSAM, harm was done to obtain the training data. This is why it’s bad.
Allero@lemmy.today 2 days ago
That would be true if children were abused specifically to obtain the training data. But what I’m talking about is using data that already exists, taken from police investigations and other sources. Of course, it also requires the victims’ consent (once they are old enough), as not everyone will agree to have materials of their abuse proliferate in any way.
Police have already used CSAM, with victims’ consent, to better impersonate CSAM platform admins in investigative operations, leading to the arrests of more child abusers and of those sharing the materials around.
The case with AI is milder, as it requires minimal human interaction, so no one will need to re-watch the materials as long as the victims are already identified. It’s enough for the police to contact the victims, get their agreement, and feed the data into the AI without releasing the source. With enough data, AI could improve image and video generation, drawing viewers away from real CSAM and reducing rates of abuse.
That is, if it works this way. There’s a glaring research gap in this area, and I believe it is paramount to figure out whether it actually helps. Then we could decide whether to include already-produced CSAM in the data, or whether adult data alone is sufficient to make the output good enough for the intended audience to make the switch.
bookmeat@lemmynsfw.com 2 days ago
You’re going to tell me there’s no corporation out there that would pay to improve its model with fresh data and not ask questions about where that data came from?
Allero@lemmy.today 1 day ago
I think such matters should be kept strictly out of corporate hands.