Camera Companies Fight AI-Generated Images With ‘Verify’ Watermark Tech
I guess this is better than nothing, but what happens if you take a photo of a generated photo? There are setups where it’s impossible to tell that the result is a photo of a photo, and then the camera will digitally sign the fake photo as real.
qevlarr@lemmy.world 11 months ago
Great, DRM on my personal photos. Next they’re going to charge a subscription to view my own goddamn vacation pictures
GenderNeutralBro@lemmy.sdf.org 11 months ago
It’s not DRM. It’s like EXIF metadata. You can strip it anytime you want, and it will often get stripped in practice even if you don’t want it to (e.g. screenshots, naive conversions, metadata-stripping upload services, etc.). It’s entirely optional and does not require any new software to view the images, only to view and manipulate the metadata.
On its own, it doesn’t tell you much about the image except that specific people/organizations edited and approved it, with specific tools. If you don’t trust those people/orgs, then you shouldn’t trust the photo. I mean, even if it says it came straight from a Nikon camera…so what? People can lie with cameras, too.
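The core idea (sign the image bytes, verify later) can be sketched in a few lines. To be clear, this is a toy stand-in: the real scheme uses public-key certificates chained to the manufacturer, not a shared secret, and every name below (`CAMERA_KEY`, `sign`, `verify`) is hypothetical. HMAC is used only to illustrate why any re-encode or edit invalidates the signature:

```python
import hmac
import hashlib

# Hypothetical stand-in for a key embedded in the camera hardware.
# Real signing uses asymmetric keys, so verifiers never hold the secret.
CAMERA_KEY = b"secret-key-burned-into-the-camera"

def sign(image_bytes: bytes) -> str:
    """Camera side: produce a signature over the exact file bytes."""
    return hmac.new(CAMERA_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify(image_bytes: bytes, signature: str) -> bool:
    """Viewer side: check the bytes still match the signature."""
    return hmac.compare_digest(sign(image_bytes), signature)

original = b"\xff\xd8...raw JPEG bytes..."
sig = sign(original)

print(verify(original, sig))            # True: untouched file checks out
print(verify(original + b"\x00", sig))  # False: any edit/re-encode breaks it
```

Note what this does and doesn’t prove: a valid signature only tells you the bytes haven’t changed since a trusted key signed them. It says nothing about whether the scene in front of the lens was real, which is exactly the photo-of-a-photo loophole mentioned above.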
I wrote a bit more about this in another thread at lemmy.sdf.org/comment/5812616 about a month ago, if you’re interested. I haven’t really played with it yet, but there’s open-source software out there you can try.
qevlarr@lemmy.world 11 months ago
You could implement it like that but I’m not convinced that’s the way this will go. The only way this will have mass adoption, I’m afraid, is if the tech giants can fleece us one way or another.