smh, back in my day we just cut out pictures of the faces of woman we wanted to see naked, and glued them ontop of (insert goon magazine of choice)
Comment on Grok AI still being used to digitally undress women and children despite suspension pledge
phoenixz@lemmy.ca 3 days agoIt’s technically possible because AI doesn’t exist. The LLM’s we have do exist and these have no idea what it’s doing.
It’s a database that can parse human language and put pixels together from requests. It has no such concept as child pornography, it’s just putting symbols together in a way it learned before that happen to form a child pornography picture
IronBird@lemmy.world 2 days ago
Perspectivist@feddit.uk 3 days ago
phoenixz@lemmy.ca 2 days ago
Not AI as the common people think it is, I guess I should have cleared that up.
AI as we currently have it is little more than a specialized database
prac@lemmy.world 3 days ago
This is a lot of words to basically say the developers didn’t bother to block illegal content. It doesn’t need to ‘understand’ morality for the humans running it to be responsible for what it produces.
Zoomboingding@lemmy.world 3 days ago
Neither of you are wrong. LLMs are wild uncaged animals. You’re asking why we didn’t make a cage, and they’re saying we don’t even know how to make one yet.
muusemuuse@sh.itjust.works 3 days ago
Because being irresponsible is financially rewarding. There’s no downside. Just golden parachutes
HasturInYellow@lemmy.world 2 days ago
We as a society have failed to implement those consequences. When the government refused, we should have taken up the mantle ourselves. It should be a mark of great virtue to have the head of a CEO mounted over your fireplace.
Honytawk@feddit.nl 2 days ago
Yeah, how hard is it to block certain keywords from being added to the prompt?
We’ve had lists like that since the 90’s. Hardly new technology. Even prevents prompt hacking if you’re clever about it.
phoenixz@lemmy.ca 2 days ago
Eh, no?
It’s really REALLY hard to know what content is, and to identify actual child porn even remotely accidentally, even with AI