Comment on Paedophiles using open source AI to create child sexual abuse content, says watchdog
BradleyUffner@lemmy.world 1 year ago
No they aren’t. An AI trained on normal, everyday images of children and sexual images of adults could easily synthesize these images.
Just like it can synthesize an image of a possum wearing a top hat without being trained on images of possums wearing top hats.
gila@lemm.ee 1 year ago
They’re talking about a Stable Diffusion LoRA trained on actual CSAM. What you described is possible too, but it’s not what the article is pointing out.