A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because they plainly and openly demonstrate one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.
This is only going to get easier. The djinn is out of the bottle.
JackGreenEarth@lemm.ee 7 months ago
That’s a ripoff. It costs them at most $0.10 to do simple Stable Diffusion img2img. And most people could do it themselves; they’re purposefully exploiting people who aren’t tech savvy.
Khrux@ttrpg.network 7 months ago
I have no sympathy for the people who are being scammed here, I hope they lose hundreds to it. Making fake porn of somebody else without their consent, particularly that which could be mistaken for real if it were to be seen by others, is awful.
I wish everyone involved in this industry a very awful day.
sentient_loom@sh.itjust.works 7 months ago
Imagine hiring a hit man and then realizing he hired another hit man at half the price. I think the government should compensate them.
echo64@lemmy.world 7 months ago
The people being exploited are the ones who are the victims of this, not people who paid for it.
sentient_loom@sh.itjust.works 7 months ago
There are many victims, including the perpetrators.
IsThisAnAI@lemmy.world 7 months ago
Scamming is another thing. Fuck these people.
But fuck dude, they aren’t taking advantage of anyone buying the service. That’s not how the fucking world works. It turns out that if you have money you can pay people to do shit like clean your house or do an oil change.
NOBODY on that side of the equation is being exploited 🤣
OKRainbowKid@feddit.de 7 months ago
In my experience with SD, getting images that aren’t obviously “wrong” in some way takes multiple iterations with quite some time spent tuning prompts and parameters.
M500@lemmy.ml 7 months ago
Wait, this is a tool built into Stable Diffusion?
As for people doing it themselves, it might be a bit too technical for some people to set up. But I’ve never tried Stable Diffusion.
SorteKanin@feddit.dk 7 months ago
It’s not like deepfake pornography is “built in”, but Stable Diffusion can take existing photos and generate stuff on top of them. That’s kinda how it works, really. The de-facto standard UI makes it pretty simple, even for someone who’s not too tech savvy: github.com/AUTOMATIC1111/stable-diffusion-webui