In the days after the US Department of Justice (DOJ) published 3.5 million pages of documents related to the late sex offender Jeffrey Epstein, multiple users on X asked Grok to “unblur” or remove the black boxes covering the faces of children and women in the images, boxes that were meant to protect their privacy.
Are these people fucking stupid? AI can’t remove something that’s baked into the image itself. The only way for it to “remove” the box is to paint a different image over it, and since it has no idea what’s underneath, it would literally just be making up a new image that has nothing to do with the original. Jfc, people are morons.
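The commenter's point can be shown in a few lines. A solid redaction box overwrites the underlying pixel values, so many different originals collapse to the identical redacted result; no algorithm can tell which one produced it, which is why any "unblur" is generation, not recovery. (This is a minimal sketch using tiny hypothetical 2x2 grayscale "images" as plain lists; note it applies to solid boxes, whereas weak Gaussian blurs or pixelation can sometimes leak partial information.)

```python
# Two distinct "originals": 2x2 grayscale images as nested lists (hypothetical data).
original_a = [[120, 200],
              [ 90,  60]]
original_b = [[255,   1],
              [  2,   3]]

def redact(image):
    """Overwrite every pixel with black (0), as a solid redaction box does."""
    return [[0 for _pixel in row] for row in image]

redacted_a = redact(original_a)
redacted_b = redact(original_b)

# Both originals map to the exact same output, so the redaction is not
# invertible: the information needed to reconstruct either original is gone.
assert redacted_a == redacted_b == [[0, 0], [0, 0]]
```

In other words, an AI asked to "remove" the box can only inpaint: synthesize plausible pixels conditioned on the surrounding image, with no connection to what was actually covered.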
pkjqpg1h@lemmy.zip 1 month ago
They have no idea how these models work :D
cupcakezealot@piefed.blahaj.zone 1 month ago
biblically accurate cw casting
TheBat@lemmy.world 1 month ago
Barrett O’Brien
annoyed_onion@lemmy.world 1 month ago
Though it is 2026. Who’s to say Elon didn’t feed the unredacted files into grok while out of his face on ket 🙃
otter@lemmy.ca 1 month ago
It feels like being back on the playground
albbi@piefed.ca 1 month ago
Wait, what? My son has been using “googolplex” when he wants a really big number. I thought it was a weird word he made up. I guess it’s a thing….
red_tomato@lemmy.world 1 month ago
Enhance!
WanderingThoughts@europe.pub 1 month ago
Uncrop!
criss_cross@lemmy.world 1 month ago
It’s the same energy as “don’t hallucinate and just say if you don’t know the answer”
pkjqpg1h@lemmy.zip 1 month ago
and don’t forget “make no mistakes” :D
Armand1@lemmy.world 1 month ago
Or percentages