Comment on Paedophiles using open source AI to create child sexual abuse content, says watchdog
jpeps@lemmy.world 1 year ago
I’m no expert, but I think this is the wrong take. I see two potential issues here:
- It can be difficult for people not to escalate behaviour. Say sexual harassment somehow became legal or permissible. How much might the rate of sexual assault rise? CSAM can be just the beginning for some people.
- This makes CSAM more easily available, and therefore likely means more people are accessing/generating it. See point one.
FMT99@lemmy.world 1 year ago
I think you’re edging towards thought crime territory here. No one is saying the real thing should be legal or the attraction should be socially acceptable.
But if this reduces the market for the real thing, I see that as a win.
jpeps@lemmy.world 1 year ago
In my comment I’m arguing (again, not an expert) that it would grow the market and ultimately increase child abuse.
Secondly, on the subject of legality: it will of course depend on your country, but I believe in most places there is currently no legal distinction between ‘simulated’ CSAM and CSAM that directly involves the exploitation of minors. If you put a thought to paper, it’s not a thought anymore.
I think the attraction itself should be more or less socially acceptable. For a lot of people it cannot be helped. People need to be able to ask for help and receive support for what I imagine is an extremely tough thing to deal with. I do not think in the slightest that CSAM in any form, including simulated (which would include hand-drawn material), should be remotely socially acceptable.