There’s an argument to be made that if the system were trained on real CSAM, then using it to generate such imagery would be immoral - but otherwise I don’t think it is immoral, and this feels like a moral panic.
CSAM is by definition evidence of a crime having happened. You can’t create it without hurting a real human being - that’s why it’s illegal. That logic doesn’t apply to simulated images or cartoons. They might be in bad taste, but nobody was hurt in the making of them, and I’m not aware of any solid evidence that viewing such content makes someone more likely to commit the real crime - just as there’s no proven link between violent movies/games and increased real-world violence.
There’s really no limit to how far this can be taken. In the past the line was clear: was a child hurt? If yes, illegal. Now we’re effectively moving toward banning violent video games and cartoons. Tomorrow it’s stick-figure fight scenes, and soon you’re not even allowed to think about it.
Of course I’m being hyperbolic here - just trying to make a point. I don’t think “I don’t like it” is justification for banning something if it can’t be shown to cause actual harm. If solid science ever proves it increases the likelihood of offending against real humans, then yeah, that’s different. But I don’t think we have that evidence. Even among pedophiles, most never offend. The vast majority of people in prison for child sexual abuse are just plain old rapists with no particular fixation on kids - children are simply easy targets.
Grail@multiverse.soulism.net 1 day ago
https://www.sciencedirect.com/science/article/pii/S0145213424003673
There’s your study. You mentioned video games causing violence. Video games may not cause violence, but video games sure cause video games: people who try out a game and like it go on to become gamers. Likewise, people who see simulated child porn and like it may go on to become CSAM offenders. Once a person becomes a regular CSAM consumer, there’s a higher chance they’ll go to the dark web and pay someone for this content - and at that point, we can see genuine harm.