There will be a lot of medical literature with photos of children’s bodies to demonstrate conditions, illnesses, etc.
smeg@infosec.pub 3 weeks ago
All of the AI tools know how to make CP somehow - probably because their creators fed it to them.
phoenixz@lemmy.ca 2 weeks ago
Yeah, press X to doubt that AI is generating child pornography from medical literature.
These fuckers have fed AI anything and everything to train them. They’ve stolen everything they could without repercussions; I wouldn’t be surprised if some of them fed their AIs child porn because “data is data” or something like that.
vaultdweller013@sh.itjust.works 2 weeks ago
Depending on how they scraped the data, they may have just let their crawlers run wild. Eventually they would’ve run into child porn, which is yet another reason why this tech is utterly shit. If you can’t control your tech you shouldn’t have it, and frankly speaking, curation is a major portion of any data processing.
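That curation step isn’t exotic, either; it’s just a filter sitting between the crawler and the training set. Here’s a minimal sketch in Python, where `BLOCKED_DOMAINS`, `BLOCKED_HASHES`, and `is_safe` are all hypothetical stand-ins for real screening infrastructure (hash-matching services such as PhotoDNA plus a classifier); nothing here is any vendor’s actual pipeline:

```python
import hashlib
from urllib.parse import urlparse

# Hypothetical placeholders for real denylist data.
BLOCKED_DOMAINS = {"known-bad.example"}
BLOCKED_HASHES = {"0" * 64}  # dummy sha256 digest

def is_safe(payload: bytes) -> bool:
    """Stand-in for a real content screen (hash matching plus a classifier).

    Here it is just a hash-denylist lookup, for illustration only.
    """
    return hashlib.sha256(payload).hexdigest() not in BLOCKED_HASHES

def curate(crawled):
    """Yield only (url, payload) pairs that pass both filters."""
    for url, payload in crawled:
        if urlparse(url).hostname in BLOCKED_DOMAINS:
            continue  # drop denylisted sources outright
        if not is_safe(payload):
            continue  # drop anything the content screen rejects
        yield url, payload
```

Real pipelines would do this at scale with perceptual hashing rather than exact digests, since exact hashes miss re-encoded copies; the point is that nothing enters the corpus without passing an explicit screen.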
phx@lemmy.world 2 weeks ago
They fed them the Internet, including libraries of pirated material. It’s like drinking from a fountain at a sewage plant.
Grimy@lemmy.world 3 weeks ago
If it knows what children look like and knows what sex looks like, it can extrapolate. That being said, I think all photos of children should be removed from the datasets, regardless of the sexual content.
Rooster326@programming.dev 3 weeks ago
Obligatory it doesn’t “know” what anything looks like.
Grimy@lemmy.world 3 weeks ago
Thank you, I almost forgot. I was busy explaining to someone else how their phone isn’t actually smart.