Comment on Man Arrested for Sharing Copyright Infringing Nude Scenes Through Reddit.
General_Effort@lemmy.world 2 months ago
That’s easy. The movie studios know what post-production went into the scenes and have the documents to prove it. They can easily prove that such clips fall under deepfake laws.
Y’all need to be more cynical. These lobby groups don’t make arguments because they believe in them, but because doing so gets them what they want.
jacksilver@lemmy.world 2 months ago
I was responding to an above comment. The guy who was arrested in OP’s article was posting clips from movies (so not deepfakes).
That being said, for deepfakes you’d need the original video to prove it was deepfaked. You’d then probably also need to prove a real person was used to make the deepfake. Nowadays it’s easy to generate “fake” people with AI, and I’m not sure where the law sits on deepfakes of fake people who happen to resemble real ones.
General_Effort@lemmy.world 2 months ago
I didn’t make my point clear. The original scenes themselves, as released by the studio, may qualify as “deepfakes”: a little digital post-processing can be enough to bring them under the proposed bills. Then sharing them becomes criminal, fair use be damned.