This problem has existed since people got good enough with MS Paint. It got a level bigger with Photoshop, and is now simply on the next level once more with AI generation.
It’s bad, but it’s also not something new, and not caused by AI but by humans misusing a tool. Again.
taladar@sh.itjust.works 13 hours ago
Let’s be perfectly honest here: especially for pictures, this has been possible for at least 20 years.
What is ruining people’s lives is the obsession a conservative society has with demonizing nudity and sex.
MangoCats@feddit.it 1 hour ago
55 years: www.imdb.com/title/tt0720250/
wizardbeard@lemmy.dbzer0.com 13 hours ago
Let’s not pretend that lowering the barrier to making fake nudes hasn’t changed the situation either.
taladar@sh.itjust.works 12 hours ago
In a way it has improved it. It makes it more plausible to claim that any nudes that do show up of you are fake.
altphoto@lemmy.today 10 hours ago
If there were no shame in being nude, then there wouldn’t be a problem… Imagine finding out that a guy you know has hundreds of portraits of you. Portraits you were never in! Absolutely not terrifying.
MoonlightFox@lemmy.world 12 hours ago
I agree in principle, society is demonizing nudity and sex. This has got to change. Society needs to change in order to fix this and many other issues related to sex and nudity.
As long as it affects a person’s reputation and their standing, this is a problem. Any person can harm someone with this technology, and as a society we cannot accept that.
Most people could not make a decent fake sex tape with any person in the world with low effort before. Now they can.
Should creating deepfakes for personal consumption be legal or illegal? Distribution is the real problem; the rest is fantasizing with tools, more or less. Some people will understandably not like it if they find out other people fantasize about them, but punishing that is close to thought crime. What is acceptable? Is a stick drawing with names too much? What if I am really good at realistic drawings? What if I draw many images in a book and make a physical animation out of it? Is the limit anything outside my head? What if I draw a politician fellating another one and distribute it as art/satire?
The short-term solution is to ban deepfakes; the long-term solution is probably something else, but I am not sure what. There is not inherently any actual abuse in deepfakes, and there is no actual sex either. So it’s a matter of reputation, honor, and disgust. These things still matter a lot in societies, so we can’t ignore them either.
BigRigButters@lemm.ee 5 hours ago
Also, people fixate on and idolize them, making it worse
DeathsEmbrace@lemm.ee 11 hours ago
Religion would like to have a word
werty@sh.itjust.works 10 hours ago
Prude shaming is exactly the same as slut shaming. Implicit in your comment is the notion that women ought to be perfectly fine with deepfake porn of themselves being created and viewed by anyone, but that society has made them prudish through the demonization of nudity and sex.
Women are people and have rights and feelings that should not be ignored simply to serve the male sex drive, and a great many women do not want to be deepfake porn stars. If you disagree then say so. Don’t hide behind the notion that society has robbed women of their desire to serve your fantasy through the demonization of sex. Sexual liberation includes the right to not have or be involved in sexual activity if one chooses. Prude shaming, like yours, is designed to remove that right.
No wonder 4b took off in Korea.
cardfire@sh.itjust.works 2 hours ago
Succinct, says the quiet part out loud, and has me exploring my own assumptions.
Thanks for being a part of this community.
taladar@sh.itjust.works 8 hours ago
Society subjects people to a lot of things that they aren’t “perfectly fine with,” but only some of those things have wider implications for reputation, career chances, ostracism, and so on, and that is the main problem with this kind of technology.
Your assumption that people need deepfake technology to fantasize about people sexually who have no interest in being on the receiving end of that is at best naive and at worst arguing in bad faith.
arararagi@ani.social 12 hours ago
It’s just like piracy: when it becomes too easy is when companies and governments begin the crackdown.
It’s been theoretically possible since Photoshop, but you had to be pretty fucking good at it; now even teens are making deepfakes of their teachers and classmates.