Comment on ‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity
ReluctantMuskrat@lemmy.world 1 year ago
It’s a problem for adults too. Circulating an AI-generated nude of a female coworker is likely to be just as harmful as a real picture: just as objectifying, humiliating, and hurtful. Neighbors or other “friends” doing it could be just as bad.
It’s sexual harassment even if fake.
Eezyville@sh.itjust.works 1 year ago
I think it should officially be considered sexual harassment. Obtaining a picture of someone and generating nudes from it seems like a pretty clear-cut case. Maybe the offense should require intent to harm, harass, exploit, or intimidate to make it official.