Comment on OpenAI, Google will watermark AI-generated content to hinder deepfakes, misinfo
dreadedsemi@lemmy.world 1 year ago
The cat is out. You can’t stop it. It’s open source. I think the more fakes out there might desensitize people and make them less inclined to believe anything. Though my experience on Facebook isn’t encouraging.
Yeah, I think people* will come to understand that not all images are real, just like they came to understand that not all headlines are real
*some people
photonic_sorcerer@lemmy.dbzer0.com 1 year ago
We need something to verify that media is what it claims to be. Will CCTV footage be admissible in court in the future? Maybe we should be looking into NFTs or the like to verify real recordings.
masonlee@lemmy.world 1 year ago
Here, some big names are working on a standard for chaining digital signatures on media files: c2pa.org.
Their idea is that the first signature would come from the camera sensor itself, and every further modification adds to the signature chain.
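The chaining idea described above can be sketched in a few lines. This is a toy illustration only: real C2PA manifests use X.509 certificates and COSE public-key signatures, while this sketch substitutes HMAC with made-up per-party keys so it stays self-contained. The key point it demonstrates is that each link signs the previous link's signature along with the new data, so altering any earlier step invalidates everything after it.

```python
import hashlib
import hmac

# Toy sketch of C2PA-style provenance chaining. Real C2PA uses X.509
# certificates and COSE signatures; keyed HMAC is a stand-in here so
# the example runs with only the standard library.

def sign(key: bytes, payload: bytes) -> bytes:
    return hmac.new(key, payload, hashlib.sha256).digest()

def add_link(chain: list, key: bytes, actor: str, data: bytes) -> list:
    # Each link signs the new data together with the previous signature,
    # so any later edit to earlier data breaks every subsequent link.
    prev_sig = chain[-1]["sig"] if chain else b""
    return chain + [{"actor": actor, "data": data,
                     "sig": sign(key, prev_sig + data)}]

def verify_chain(chain: list, keys: dict) -> bool:
    prev_sig = b""
    for link in chain:
        expected = sign(keys[link["actor"]], prev_sig + link["data"])
        if not hmac.compare_digest(expected, link["sig"]):
            return False
        prev_sig = link["sig"]
    return True

# Hypothetical keys: the first signature comes from the camera sensor,
# later edits append to the chain.
keys = {"camera": b"sensor-key", "editor": b"editor-key"}
chain = add_link([], keys["camera"], "camera", b"raw sensor capture")
chain = add_link(chain, keys["editor"], "editor", b"cropped + color-corrected")
assert verify_chain(chain, keys)

# Tampering with the original capture breaks the whole chain:
chain[0]["data"] = b"doctored capture"
assert not verify_chain(chain, keys)
```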
excel@lemmy.megumin.org 1 year ago
This sounds like a job for digital signatures using traditional certificate authority infrastructure, which has been around for decades. Think digitally signed emails or software installers… Seems like NFTs made society collectively forget about every other form of encryption.
Also keep in mind that you can only ever verify the source, not the truthfulness of the data (e.g. you can prove that the image was created by New York Times and hasn’t been altered since then, but you will never be able to prove that NYT themselves didn’t fake it in some way). Verifying the source is obviously still useful though, so it seems dumb that digital signatures aren’t more ubiquitous.
My guess is that they’ve been resisted for privacy reasons.
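The "source, not truthfulness" distinction above is easy to show concretely. A minimal sketch, with an invented signing key standing in for a publisher's real one (production systems would use public-key signatures such as Ed25519 or RSA rather than HMAC): a valid signature proves who produced the bytes and that they haven't changed since, but a staged or faked image signed by the same party verifies just as cleanly.

```python
import hashlib
import hmac

# Hypothetical publisher signing key -- stand-in for a real
# public-key certificate held by, e.g., a newspaper.
publisher_key = b"hypothetical-publisher-signing-key"

def sign(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

def verify(key: bytes, msg: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(key, msg), sig)

honest = b"photo of an actual event"
staged = b"convincingly staged photo"

sig_honest = sign(publisher_key, honest)
sig_staged = sign(publisher_key, staged)

# Verification proves origin and integrity...
assert verify(publisher_key, honest, sig_honest)
# ...but it passes equally for content the publisher faked:
assert verify(publisher_key, staged, sig_staged)
# What it does catch is any alteration after signing:
assert not verify(publisher_key, honest + b"!", sig_honest)
```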