Hmm, interesting, thanks. Has anyone been charged or convicted under this law yet?
Grok’s ‘spicy’ video setting instantly made me Taylor Swift nude deepfakes
snausagesinablanket@lemmy.world 1 day ago
Under what law?
The Take It Down Act: on April 28, 2025, Congress passed S. 146, the TAKE IT DOWN Act, a bill that criminalizes the nonconsensual publication of intimate images, including “digital forgeries” (i.e., deepfakes), in certain circumstances.
Ulrich@feddit.org 1 day ago
Cethin@lemmy.zip 9 hours ago
Definitely not convicted. That’d be some crazy speed.
However, your insistence that it can’t happen because it hasn’t happened yet is insane. Every first case is, by definition, one that had never happened before.
Ulrich@feddit.org 3 hours ago
> However, your insistence that it can’t happen because it hasn’t happened yet is insane
It would be insane if that were what I had insisted, but I didn’t.
michaelmrose@lemmy.world 1 day ago
Is providing it over a private channel to a single user “publication”?
I suspect that you will have to directly regulate image generation
snausagesinablanket@lemmy.world 1 day ago
It’s already being done to help prevent fake CSAM.
That should have been standard from the start.
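For a sense of what regulating generation directly looks like in practice, here is a minimal sketch of the kind of prompt filtering image generators already apply. Everything here is hypothetical: real systems use trained classifiers and human review, not a hand-written keyword blocklist, and the `BLOCKED_TERMS` list and `is_prompt_allowed` function are illustrative names, not any vendor’s actual API.

```python
# Hypothetical, simplified prompt filter: reject generation requests
# whose prompt contains any blocked term (case-insensitive).
BLOCKED_TERMS = {"nude", "deepfake"}

def is_prompt_allowed(prompt: str) -> bool:
    """Return True if no blocked term appears as a word in the prompt."""
    words = prompt.lower().split()
    return not any(term in words for term in BLOCKED_TERMS)
```

A production filter would run before the model ever sees the request, which is the point: the check happens at generation time, not at publication time.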
Lucidlethargy@sh.itjust.works 23 hours ago
I don’t think anyone has any delusions that Twitter is private, not even DMs.
michaelmrose@lemmy.world 26 minutes ago
It absolutely is private insofar as it is a channel between the software running on their end and the user operating it. The lack of end-to-end encryption doesn’t make it not private; it makes it insecure, which says nothing about the issue raised: a user’s creation of an image isn’t likely to be considered publication until they share it.
It’s highly probable that keeping people from generating deepfake nudes will require additional law.