Comment on Instagram Advertises Nonconsensual AI Nude Apps
BreakDecks@lemmy.ml 6 months ago
Generative AI is being used quite prominently for the purposes of making nonconsensual pornography. Just look at the state of CivitAI, the largest marketplace of Stable Diffusion models online. It pretends to be a community for Machine Learning professionals, but behind the scenes it’s laying the groundwork for all of the problems we’re seeing right now. There’s not an actress or female celebrity that doesn’t have a TI or LoRA trained on their likeness - and the galleries don’t hold back on showing you what these models can do.
At least Photoshop never gained the specific reputation of being a tool for making fake porn, but the GenAI community is leaving no doubt that this is a major use case for image models.
Even HuggingFace turns a blind eye to pornifying models and lolicon datasets, and they’re basically the GitHub of AI models…
ArmokGoB@lemmy.dbzer0.com 6 months ago
Knowledge of fission is often applied to make nuclear bombs, but also to generate nuclear power. We shouldn’t blame AI as a whole for this just because some creeps use it for shitty applications.
BreakDecks@lemmy.ml 6 months ago
That’s kinda why I brought up specific key players, and why I consider them complicit. If you don’t want AI to be blamed as a whole, you should want those key players to behave ethically; otherwise they’ll poison public perception of AI as a whole.