Comment on ‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity
TORFdot0@lemmy.world 11 months ago
It’s the sexualization of people without consent that’s the problem. Maybe casual nudity shouldn’t be a problem, but it should be up to the individual whom they share that with. And “nudify” AI models go beyond casual, consensual nudity into sexual objectification and harassment when used without consent.
KairuByte@lemmy.dbzer0.com 11 months ago
I want to point out one slight flaw in your argument. Nudity isn’t needed for people to sexually objectify you. And even if it was, the majority of people are able to strip you down in their head no problem.
There’s a huge potential for harassment though, and I think that should be the main concern.
TimewornTraveler@lemm.ee 11 months ago
first, relevant xkcd: xkcd.com/1432/
second,
do you really think that makes it less bad? that it’s opt-in?
apparently this app helps them too