Comment on Teen deepfake victim pushes for federal law targeting AI-generated explicit content
TwilightVulpine@lemmy.world 9 months ago
Other than vague slippery-slope fearmongering, I don’t see how banning the creation and distribution of deepfake porn is going to lead to AI being monopolized by corporations. If you have your own personally trained, locally run AI model, you have complete control over what sort of content it generates. Why would you have issues with deepfake porn laws if you are not generating and hosting that content?
It just doesn’t add up, there’s some logical leap here that seems almost on the level of conspiracy theories. As much as governments do tend to favor corporations over regular people there is nothing so far even vaguely suggesting that AI would be so profoundly restricted that only corporations could use it. In fact, what has been described of what is proposed so far does not target the technology at all, only the users who engage in this kind of bad conduct.
But I profoundly disagree with this “nothing to be done about it” attitude. How would fighting it be worse than letting people suffer from it? It’s not like drugs, where the main person who might be harmed is the user themselves; this affects unrelated, vulnerable people.
If the people making deepfake porn and the sites hosting it can be identified, it can be taken down. You could argue that not every responsible person will be identified, but it might still be enough to diminish its prevalence and the number of victims. And if the remaining ones have to be sneaky about it, that might still mean less harassment of the victims.
You compare it to the war on drugs. Meanwhile I think of the rise of the automobile, with people crying that seat belts and traffic lights were ruining their freedom and “there’s nothing to be done” about people dying in car crashes.
curiousaur@reddthat.com 9 months ago
If everyone could create their own, and just run it locally, explain how the laws could be enforced?
aesthelete@lemmy.world 9 months ago
Arguing that because a crime can be committed with a tool, outlawing the crime outlaws the tool is a bad argument. Outlawing murder doesn’t outlaw knives.
As far as enforcement goes, it may be enforced with varying degrees of success, but the fact that someone might get away with a crime isn’t a reason not to make it a crime.
If someone created deepfakes using locally run models, rubbed one out, and then deleted everything, they probably wouldn’t be caught… but largely, who cares that they weren’t? It’s the harm to others that you want to prevent, and if a person didn’t distribute the image at all, their “getting away with it” doesn’t matter much.
curiousaur@reddthat.com 9 months ago
You conceded that no one cares if someone makes images locally then deletes them. But that’s how they’re all going to be made shortly.
Currently, folks are sharing them because not everyone has the means to create them; the few who do share what they’ve made.
Once literally everyone can make them the moment they want to, no one will be sharing. Everyone will fall under that use case you admitted no one would care about, which is exactly what I’ve been saying. It’s 1. futile to try to stop, and 2. going to become so widespread that we as a society will stop caring about it.
aesthelete@lemmy.world 9 months ago
I do not think this is true. There are reasons to distribute them other than to have a wank and delete them immediately afterwards.