They think that the AI is smart enough to deduce from the pixels around it what the original face must have looked like, even though there’s actually no reason why there should be a strict causal relationship between those things.
Comment on Epstein Files: X Users Are Asking Grok to 'Unblur' Photos of Children
ToTheGraveMyLove@sh.itjust.works 21 hours ago
Are these people fucking stupid? AI can’t remove something hardcoded into the image. The only way for it to “remove” a blur is to place a different image over it, but since it has no idea what’s underneath, it would literally just be making up a new image that has nothing to do with the content of the original. Jfc, people are morons.
mfed1122@discuss.tchncs.de 13 hours ago
Pyr_Pressure@lemmy.ca 21 hours ago
There was someone who reported that, due to the incompetence of White House staffers, some of the Epstein files had simply been “redacted” in MS Word by highlighting the text black. People were actually able to remove the redactions by converting the PDF back into a Word document and deleting the black highlighting to reveal the text.
Who knows if some of the photos might be the same issue.
KyuubiNoKitsune@lemmy.blahaj.zone 20 hours ago
That’s not how images like PNGs or JPGs work.
unmagical@lemmy.ml 19 hours ago
In the case of what wound up on Roman Numeral Ten (formerly Twitter), that’s correct. But in the actual PDF dump from the gov, if they just slapped an annotation on top of the image, it’s possible to remove it and reveal what’s underneath.
KyuubiNoKitsune@lemmy.blahaj.zone 19 hours ago
I didn’t realise that they released the images as pdfs too.
unmagical@lemmy.ml 19 hours ago
It was simpler than that. You can just copy the black-highlighted text and paste it anywhere.
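A toy sketch of why that works (the `Page` class below is made up for illustration, not a real PDF or Word API): the text lives in the document’s content, and a black highlight is just a shape drawn on top of it. Copy/paste reads the content and never consults the shapes, so the “redacted” text comes out intact.

```python
# Hypothetical model of a document page: content plus overlay shapes.
class Page:
    def __init__(self, text):
        self.content = text       # what the document actually contains
        self.overlays = []        # shapes drawn over the text

    def highlight_black(self, start, end):
        # Visually covers text[start:end] but removes nothing.
        self.overlays.append(("black_rect", start, end))

    def extract_text(self):
        # Ctrl+A / Ctrl+C reads the content stream, ignoring overlays.
        return self.content

page = Page("The name under the black box is still here.")
page.highlight_black(4, 8)
print(page.extract_text())  # prints the full, un-redacted text
```

Proper redaction tools instead delete the covered text from the content itself before flattening the file, which is the step that was apparently skipped here.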
klymilark@herbicide.fallcounty.omg.lol 18 hours ago
“Hackers used advanced hacking to unredact the Epstein files!” - Actual headline. The “hackers” did just Ctrl+A, Ctrl+C, opens word processor, Ctrl+V
bcgm3@lemmy.world 18 hours ago
Ctrl+A, Ctrl+C, opens word processor, Ctrl+V
DID YOU JUST DOWNLOAD A VIRUS ON MY KEYBOARD?
PostaL@lemmy.world 15 hours ago
Hey! Cut it out! If those people could read, they’d be very upset!
pkjqpg1h@lemmy.zip 20 hours ago
Actually, there is a short video on that page that explains this with examples
ToTheGraveMyLove@sh.itjust.works 18 hours ago
Video ≠ article
usualsuspect191@lemmy.ca 20 hours ago
The black boxes would be impossible, but there are some types of blur that keep enough of the original data that they can be undone. There was a pedophile who used a swirl to cover his face in pictures, and investigators were able to unswirl the images and identify him.
With how the rest of it has gone it wouldn’t surprise me if someone was incompetent enough to use a reversible one, although I have doubts Grok would do it properly.
BarneyPiccolo@lemmy.today 20 hours ago
Several years ago, authorities were searching for a guy who had been going around the world molesting children, photographing the abuse, and distributing the photos on the Internet. He was often in the photos, but he had chosen to use some sort of swirl blur on his face to hide it. The authorities just “unswirled” it, and there was his face, in all those photos of abused children.
They caught him soon after.
Barracuda@lemmy.zip 19 hours ago
A swirl is a distortion that is non-destructive. An anonymity blur averages out pixels over a wide area in a repetitive manner, which destroys information. Would it be possible to reverse? Maybe a little bit, maybe some small fraction of the pixels, but there wouldn’t be any way to prove the accuracy of those pixels, and there would be massive gaps in the information.
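A quick way to see why an averaging blur can’t be uniquely reversed: it’s a many-to-one mapping, so different originals can blur to exactly the same output. A toy 1-D sketch in Python (my own illustration, not the blur actually used in the files):

```python
def box_blur(row, radius=1):
    # Average each pixel with its neighbours, clamping at the edges.
    out = []
    for i in range(len(row)):
        window = row[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

a = [10, 20, 30, 20, 10]
b = [5, 25, 30, 15, 15]   # a different "original"

print(box_blur(a) == box_blur(b))  # True: identical blurred output
print(a == b)                      # False: the originals differ
```

Since two distinct inputs collapse onto one output, no algorithm, AI or otherwise, can tell you which one you started from; at best it can guess.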
altkey@lemmy.dbzer0.com 18 hours ago
Swirl is destructive, like almost everything in raster graphics once recompression is involved, but unswirling it makes a good approximation at somewhat reduced quality. If the program or the effect’s code is known, e.g. they did it in Photoshop, you just drag a slider to the opposite side. Come to think of it, it could be a nice puzzle in an adventure game, or another kind of captcha.
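The drag-the-slider-back idea can be sketched with a toy “swirl” (my own illustration, not Photoshop’s actual filter): shift each row by an amount that depends on its distance from the centre. Each shift is a permutation of pixels, so nothing is lost, and applying the opposite shift restores the image exactly.

```python
def roll(row, k):
    # Circularly shift a row right by k places (k may be negative).
    j = k % len(row)
    return row[len(row) - j:] + row[:len(row) - j]

def swirl(img, strength=1):
    # Rows nearer the centre get shifted further, like a crude swirl.
    centre = len(img) // 2
    return [roll(row, strength * (centre - abs(i - centre)))
            for i, row in enumerate(img)]

img = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11], [12, 13, 14, 15]]
twisted = swirl(img)            # distorted: rows are rotated
restored = swirl(twisted, -1)   # "drag the slider the other way"
print(restored == img)          # True: recovered exactly
```

A real swirl rotates pixels by fractional amounts and has to resample between them, which is where the recompression-style quality loss comes in; the inversion is then approximate rather than exact, as above.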
Barracuda@lemmy.zip 12 hours ago
You’re right. By “non-destructive” I meant more that it’s reversible, depending on factors like intensity and knowing the algorithm used.
floquant@lemmy.dbzer0.com 14 hours ago
Yeah, but that kind of inverse-transform technique and the diffusion models used in image genAI are almost completely disjoint.
usualsuspect191@lemmy.ca 13 hours ago
Agree with you there. Just pointing out that in theory and with the right technique, some blurring methods can be undone. Grok most certainly is the wrong tool for the job.
priapus@piefed.social 17 hours ago
It’s true that some blurs could be undone, but the ones used in the files are definitely destructive and cannot be reversed. Grok, or any other image-generation tool, is also definitely not capable of doing it: reversal requires specific algorithms and knowledge of how the image was blurred in the first place, while generative models simply guess what it should look like.