Comment on ‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity
KairuByte@lemmy.dbzer0.com 11 months ago
I mean, you’ve been able to do a cursory search and turn up dozens of “celeb lookalike” porn videos for many years now. “Scarjo goes bareback” isn’t hard to find, but that ain’t Scarjo in the video. How is this different?
shuzuko@midwest.social 11 months ago
This is different because, to a certain extent, people in the public eye can expect, anticipate, and react to or suppress this kind of thing. They have managers and PR people who can help them handle it in a way that doesn’t negatively affect them. Billy’s 13-year-old classmate Stacy doesn’t have those resources, and now he can do the same thing to her. It’s on a very different level of harm.
KairuByte@lemmy.dbzer0.com 11 months ago
Billy doesn’t need a nudify app to imagine Stacy naked. Not to mention, images of a naked 13-year-old are illegal regardless.
azertyfun@sh.itjust.works 11 months ago
Why are you pretending that “nudify apps” produce ephemeral pictures equivalent to a mental image? They most definitely do not.
Underage teenagers already HAVE shared fake porn of their classmates. It being illegal doesn’t stop them, and as fun as locking up a thirteen-year-old sounds (assuming they get caught, prosecuted, and convicted), that still leaves another kid traumatized.
KairuByte@lemmy.dbzer0.com 11 months ago
So if illegality doesn’t stop things from happening… how exactly are you stopping these apps from being made?
Sweetpeaches69@lemmy.world 11 months ago
Just as the other people in this made-up scenario don’t need an app to imagine Scarlett Johansson naked. It’s a moot point.
CleoTheWizard@lemmy.world 11 months ago
I think most of this is irrelevant, because AI image generation as a tool is inherently hard to limit in this way, and I think it will become so prevalent as to be hard to regulate. What I’m saying is: we should prepare for a future where fake nudes of literally anyone can be made and shared easily. It’s already too late. These tools, as was said earlier, already exist and are here. The only thing we can do is severely punish people who post the photos publicly. Sadly, we know how slow laws are to change, so in that light, we need to legislate based on long-term impact instead of short-term reactions.
KairuByte@lemmy.dbzer0.com 11 months ago
And?… There’s a major difference between “a lookalike of a grown adult” and “AI-generated child porn,” as I’m sure you’re aware. At no point did anyone say child porn was going to be legal, until the person I was replying to brought it up as a strawman argument. ¯\_(ツ)_/¯