Comment on Grok AI still being used to digitally undress women and children despite suspension pledge
Allero@lemmy.today 2 days ago
Why though? If it does reduce consumption of real CSAM (which is an “if”, as the stigma around the topic greatly hinders research), it’s a net win.
Or is it simply a matter of spite?
VoteNixon2016@lemmy.blahaj.zone 2 days ago
Because with harm reduction as the goal, the solution is never “give them more of the harmful thing.”
I’ll compare it to the problems of drug abuse. You don’t help someone with an addiction by giving them more drugs; you don’t help them by throwing them in jail just for having an addiction; you help them by making it safe and easy to get treatment for the addiction.
Look at what Portugal did in the early 2000s to help mitigate the problems associated with drug use, treating it as a health crisis rather than a criminal one.
You don’t arrest someone for being addicted to meth, you arrest them for stabbing someone and stealing their wallet to buy more meth; you don’t arrest someone just for being a pedophile, you arrest them for abusing children.
No, it most certainly does not. AI is already being used to generate explicit images of actual children. Making it better at that task is the opposite of harm reduction; it makes creating new victims easier than ever.
Acting reasonably to prevent what we can prevent means shutting down the CSAM-generating bot, not optimizing and improving it.
Allero@lemmy.today 1 day ago
To me, it’s more like the Netherlands handing out free needles and syringes so that drug users at least don’t contract diseases from used ones.
To be clear: granting any and all pedophiles access to therapy would be of tremendous help. I think it must be done. But there are two issues remaining: