Taylor Swift AI images prompt US bill to tackle nonconsensual, sexual deepfakes::Bipartisan measure introduced in the US Senate would allow victims of ‘digital forgeries’ to seek civil penalties against perpetrators
Shit, I better delete some of my Danny DeVito folders…
TheGrandNagus@lemmy.world 9 months ago
Unfortunate that it took a famous person becoming a victim of this before any action was taken.
That said, will this do anything to help? People are still going to do it regardless.
In a weird way, could this tech not be a good thing? There are unfortunately plenty of stories of young people in particular becoming extremely stressed to the point of harming themselves over leaks of their nudes.
If in the future, they can be trivially dismissed as being easily-generated deepfakes (regardless of whether they actually are), would that not lessen the harm considerably? Would it not recalibrate our brains to care about it less?
I don’t know the answers to any of this. We’re going into uncharted territory.
GregorGizeh@lemmy.zip 9 months ago
lemmy.zip/comment/6691303
I actually argued similarly over here. This fake outrage because a celebrity was targeted serves nobody. Fakes will just get better, and legislation will always be on the back foot.
General_Effort@lemmy.world 9 months ago
I wonder what it will do to the stress levels of parents, knowing that their kid can be on the hook for $150k if they share some fake. Would any liability insurance cover this?
abhibeckert@lemmy.world 9 months ago
Would they? Last year a woman was awarded $1.2b in damages after her ex-boyfriend distributed revenge porn.
How many people would hit that retweet button if they knew it might lead to damages on that scale? Sure, some people would still do it, but not very many. And they’d probably only do it once.
TheGrandNagus@lemmy.world 9 months ago
Yes.
People break copyright and other IP laws all the time, for example.
Shit, torrenting a film carries a 10-year maximum prison sentence where I am. It doesn’t stop anybody.
And that case is a hell of a lot more than someone retweeting or upvoting a deepfake.
It covers someone constantly uploading porn of his ex-partner and blackmailing her (even days before the court case), impersonating her online, doxxing her, and sending porn of her to her family members.
It also covers him illegally using her bank account to pay his bills and using her personal information to apply for loans in her name.
That case is a very, very, very, very different situation to someone making a Taylor Swift deepfake.