Calling it an invasion of privacy is a stretch, in the same way that copyright infringement gets called theft.
Comment on Grok AI still being used to digitally undress women and children despite suspension pledge
nullroot@lemmy.world 2 days ago
It’s not an artistic representation, it’s worse. It’s algorithmic, and to that extent it actually has a pretty good idea of what a person looks like naked based on their picture. That’s why it’s so disturbing.
bookmeat@lemmynsfw.com 1 day ago
aesthelete@lemmy.world 2 days ago
Yeah, they probably fed it a bunch of legitimate on/off content, as well as stuff from people who used to make nudes from celebrity photos with sheer/skimpy outfits as a creepy hobby.
nullroot@lemmy.world 2 days ago
Also, CSAM in training data is definitely a thing.
Allero@lemmy.today 2 days ago
Honestly, I’d love to see more research on how AI CSAM consumption affects consumption of real CSAM and rates of sexual abuse.
Because if it does reduce them, it might make sense to intentionally use datasets already involved in previous police investigations as training data. But only if there’s a clear reduction.
(Police have already used some materials, with victims’ consent, to crack down on CSAM-sharing platforms in the past.)
NikkiDimes@lemmy.world 2 days ago
Or we could like…not
VoteNixon2016@lemmy.blahaj.zone 1 day ago
No.
bookmeat@lemmynsfw.com 1 day ago
The idea is that to generate CSAM, harm was done to obtain the training data. That’s why it’s bad.