Google Search To Show If An Image Is AI Generated, Edited Or Taken With Camera
Submitted 1 month ago by 101@feddit.org to technology@lemmy.world
https://www.seroundtable.com/google-search-image-labels-ai-edited-38082.html
Comments
nyan@lemmy.cafe 1 month ago
You may be able to prove that a photo with certain metadata was taken by a camera (my understanding is that that’s the method), but you can’t prove that a photo without it wasn’t, because older cameras won’t have the necessary support, and wiping metadata is trivial anyway. So is it better to have more false negatives than false positives? Maybe. My suspicion is that it won’t make much difference to most people.
T156@lemmy.world 1 month ago
A fair few sites will also wipe metadata for safety reasons, since photo metadata can include things like the location where the photo was taken.
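For illustration, here is a minimal sketch of that kind of pipeline (using Pillow; `strip_metadata` is a made-up helper name, not any site’s actual code). Re-encoding from raw pixels means EXIF, including GPS tags, never reaches the output:

```python
from io import BytesIO

from PIL import Image


def strip_metadata(jpeg_bytes: bytes) -> bytes:
    """Re-encode a JPEG from its raw pixels only, dropping EXIF
    (and with it GPS location tags) along the way."""
    src = Image.open(BytesIO(jpeg_bytes))
    clean = Image.new(src.mode, src.size)
    clean.putdata(list(src.getdata()))  # copy pixels, nothing else
    out = BytesIO()
    clean.save(out, format="JPEG")
    return out.getvalue()


# Build a tiny JPEG carrying an EXIF "Make" tag (0x010F) as a stand-in
# for location data
exif = Image.Exif()
exif[0x010F] = "ExampleCam"
buf = BytesIO()
Image.new("RGB", (8, 8), "red").save(buf, format="JPEG", exif=exif.tobytes())

stripped = strip_metadata(buf.getvalue())
print(dict(Image.open(BytesIO(stripped)).getexif()))  # {}
```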
WolfLink@sh.itjust.works 1 month ago
Even if you assume the images you care about have this metadata, all it takes is a hacked camera (which could be as simple as carefully taking a photo of your AI-generated image) to fake authenticity.
tal@lemmy.today 1 month ago
looks dubious
The problem here is that if this is unreliable – and I’m skeptical that Google can produce a system that will work across-the-board – then you have a synthesized image that now has Google attesting that it is non-synthetic.
AbouBenAdhem@lemmy.world 1 month ago
The problem here is that if this is unreliable…
And the problem if it is reliable is that everyone becomes dependent on Google to literally define reality.
xenoclast@lemmy.world 1 month ago
Fun fact about AI products (or any gold rush economy) it doesn’t have to work. It just has to sell.
SchmidtGenetics@lemmy.world 1 month ago
I guess this would be a good reason to include some EXIF data when images are hosted on websites; from my limited understanding, that’s one of the only ways to tell whether an image is genuine.
conciselyverbose@sh.itjust.works 1 month ago
No, the default should be removing everything but maybe the date because of privacy implications.
stupidcasey@lemmy.world 1 month ago
Lol, knowing the post-processing done on your iPhone, this whole thing sounds like an actual joke. Does no one remember the fake moon incident? Your photos have been AI-generated for years and no one noticed; no algorithm on earth could tell the difference between a phone photo and an AI photo, because they are the same thing.
remer@lemmy.world 1 month ago
Are you saying the moon landing was faked or did I miss something?
stupidcasey@lemmy.world 1 month ago
You absolutely missed everything. The moon is fake, literally: when you take a picture of the moon, your camera uses AI photo manipulation to swap your garbage picture for a completely AI-generated image, because taking pictures of the moon is actually pretty difficult. It makes pictures look much better, and in 99% of cases it is better, but in edge cases – like trying to photograph something flying in front of the moon, say the ISS or a cloud – it is not. It may also cause issues if you try to introduce your photos in court, because everything you take is inherently doctored.
restingboredface@sh.itjust.works 1 month ago
It’s of course troubling that AI images will go unidentified through this service (I am also not at all confident that Google can do this well/consistently).
However I’m also worried about the opposite side of this problem- real images being mislabeled as AI. I can see a lot of bad actors using that to discredit legitimate news sources or stories that don’t fit their narrative.
Dagamant@lemmy.world 1 month ago
I watched a video on methods for detecting AI generation in images. One of them was comparing the noise across color channels: camera sensors produce different noise in different channels, while AI generators don’t. There is also stuff to look for like JPEG compression artifacts.
So there are technical solutions, though I wouldn’t know how to automate them.
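A toy version of that channel-noise check can be sketched with NumPy. Everything here is synthetic and illustrative – the two test images and the simple 4-neighbour residual are assumptions, and real detectors are far more involved:

```python
import numpy as np


def channel_noise_profile(img):
    """Per-channel high-frequency noise estimate for an HxWx3 array:
    subtract each pixel's 4-neighbour mean, then take the residual's
    standard deviation (interior pixels only, to avoid wrap-around edges)."""
    profile = []
    for c in range(3):
        ch = img[:, :, c].astype(float)
        local = (np.roll(ch, 1, 0) + np.roll(ch, -1, 0) +
                 np.roll(ch, 1, 1) + np.roll(ch, -1, 1)) / 4.0
        profile.append(float((ch - local)[1:-1, 1:-1].std()))
    return profile


rng = np.random.default_rng(0)
# smooth grey ramp as the underlying "scene"
base = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))[:, :, None].repeat(3, axis=2)
# "camera-like": independent noise, scaled differently per channel
camera = base + rng.normal(0.0, 0.05, base.shape) * np.array([1.0, 0.5, 1.5])
# "generator-like": one noise field shared identically by all channels
synthetic = base + rng.normal(0.0, 0.05, (64, 64, 1)).repeat(3, axis=2)

print(channel_noise_profile(camera))     # channel noise levels differ
print(channel_noise_profile(synthetic))  # channel noise levels identical
```

The giveaway in this toy setup is the spread between channels, not the absolute noise level.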
AbouBenAdhem@lemmy.world 1 month ago
Those would be easy things to add, if you were trying to pass it off as real.
WolfLink@sh.itjust.works 1 month ago
Take a high-quality AI image, add some noise, blur, and compress it a few times.
Or, even better, print it and take a picture of the print out, making sure your photo of the photo is blurry enough to hide the details that would give it away.
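That laundering step is easy to sketch with NumPy (`launder` is a hypothetical name; a real attempt would also re-encode as JPEG a few times, which this sketch skips):

```python
import numpy as np


def launder(img, rng, rounds=3, noise=0.02):
    """Crudely 'launder' an image with values in [0, 1]: alternate
    sensor-like noise with a 3x3 box blur to wash out the
    high-frequency details that could give a generator away."""
    out = img.astype(float)
    for _ in range(rounds):
        out = out + rng.normal(0.0, noise, out.shape)
        acc = np.zeros_like(out)
        for dy in (-1, 0, 1):          # 3x3 box blur via shifted sums
            for dx in (-1, 0, 1):
                acc += np.roll(np.roll(out, dy, 0), dx, 1)
        out = acc / 9.0
    return np.clip(out, 0.0, 1.0)


# a high-frequency checkerboard stands in for crisp generator output
checker = (np.indices((32, 32)).sum(axis=0) % 2).astype(float)
washed = launder(checker, np.random.default_rng(1))
print(checker.std(), washed.std())  # most of the fine detail is gone
```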
apfelwoiSchoppen@lemmy.world 1 month ago
So they are going to use AI to detect AI.
hemko@lemmy.dbzer0.com 1 month ago
They’re going to use AI to train AI*
So nothing new here
apfelwoiSchoppen@lemmy.world 1 month ago
Use AI to train AI to detect AI, got it.