Comment on Teen deepfake victim pushes for federal law targeting AI-generated explicit content
curiousaur@reddthat.com 9 months ago
You conceded that no one cares if someone makes images locally then deletes them. But that’s how they’re all going to be made shortly.
Currently, folks are sharing them because not everyone has the means to create them; those who do share what they’ve made.
Once literally everyone can make them the moment they want to, no one will be sharing. Everyone will fall under the use case that you admitted no one would care about, which is exactly what I’ve been saying. It’s 1. futile to try to stop, and 2. going to become so widespread that we as a society will stop caring about it.
aesthelete@lemmy.world 9 months ago
I do not think this is true. There are reasons to distribute them other than to have a wank and delete them immediately afterwards.
curiousaur@reddthat.com 9 months ago
Like what? Why share something when anyone curious to see it can instantly generate their own?
aesthelete@lemmy.world 9 months ago
I’m curious as to why you cannot come up with any yourself, but here are a few off the top of my head: to pass them off as authentic (likely for clout purposes), to have a laugh with the boys about it, to collaborate with others on them, and to distribute them to harass, ridicule, or disparage the target.
Degenerates exist in lots of shapes and forms, and not all degenerates will have enough of a sense of shame to be degenerates privately or to even know they are being degenerates at all.
curiousaur@reddthat.com 9 months ago
I don’t think you’re properly understanding the paradigm shift that’s coming as these models become open source and widely available while wearable AR smart glasses get better.
“You know Sharon in HR? Look at this scandalous photo of her.”
“Uh, I’m seeing a live generated porno of everyone in this room right now. Why would I care about that?”