In the days after the US Department of Justice (DOJ) published 3.5 million pages of documents related to the late sex offender Jeffrey Epstein, multiple users on X have asked Grok to “unblur” or remove the black boxes covering the faces of children and women in images that were meant to protect their privacy.
Are these people fucking stupid? AI can’t remove something hardcoded into the image. The only way for it to “remove” the box is to paint a new image over it, and since it has no idea what’s underneath, it would just be inventing content that has nothing to do with the original. Jfc, people are morons.
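The point above can be sketched in a few lines. A toy NumPy example (the 8×8 array and region coordinates are made up for illustration) shows that a solid black box overwrites the pixel values in the file itself, so two different originals become byte-identical after redaction and no model can distinguish them afterwards:

```python
import numpy as np

# Toy 8x8 grayscale "image" with random pixel values.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)

# Apply a solid black box over rows 2-5, cols 2-5 (the "redaction").
redacted = image.copy()
redacted[2:6, 2:6] = 0

# Every pixel under the box is now exactly 0 -- the original values
# are gone from the file, not hidden behind a layer.
assert np.all(redacted[2:6, 2:6] == 0)

# Any two images that differ only inside the box redact to the
# same bytes, so no algorithm can tell them apart afterwards.
other = image.copy()
other[2:6, 2:6] = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)
other_redacted = other.copy()
other_redacted[2:6, 2:6] = 0
assert np.array_equal(redacted, other_redacted)
```

Anything a model paints into that region is sampled from its training prior, not recovered from the file; the information simply isn’t there.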
pkjqpg1h@lemmy.zip 3 weeks ago
They have no idea how these models work :D
pkjqpg1h@lemmy.zip 3 weeks ago
[image]
cupcakezealot@piefed.blahaj.zone 3 weeks ago
biblically accurate cw casting
TheBat@lemmy.world 3 weeks ago
Barrett O’Brien
annoyed_onion@lemmy.world 3 weeks ago
Though it is 2026. Who’s to say Elon didn’t feed the unredacted files into Grok while out of his face on ket 🙃
otter@lemmy.ca 3 weeks ago
It feels like being back on the playground
albbi@piefed.ca 2 weeks ago
Wait, what? My son has been using “googolplex” when he wants a really big number. I thought it was a weird word he made up. I guess it’s a thing….
red_tomato@lemmy.world 3 weeks ago
Enhance!
WanderingThoughts@europe.pub 3 weeks ago
Uncrop!
criss_cross@lemmy.world 2 weeks ago
It’s the same energy as “don’t hallucinate and just say if you don’t know the answer”
pkjqpg1h@lemmy.zip 2 weeks ago
and don’t forget “make no mistakes” :D
Armand1@lemmy.world 3 weeks ago
Or percentages