AI can generate images of things that don’t even exist. If it knows what porn looks like and what a child looks like, it can combine those concepts.
Comment on Kids are making deepfakes of each other, and laws aren’t keeping up
gkpy@feddit.org 2 days ago
Cheers for the explanation, had no idea that’s how it works.
So it’s even worse than /u/danciestlobster@lemmy.zip thinks: the person creating the deepfake would have to have access to CP if they wanted to deepfake it!
Vinstaal0@feddit.nl 2 days ago
You can probably do it with adult material and swap in the faces. It will most likely work with models specifically trained on the person you selected.
People have also put dots on people’s clothing to trick the brain into thinking they are naked; you could probably fill those dots in with the correct body parts if you have a good enough model.
some_guy@lemmy.sdf.org 2 days ago
There are adults whose bodies resemble those of underage people, and those could be used to train models. Kitty Yung has a body that would qualify. You don’t necessarily need to use illegal material in training to get illegal output.