Comments on “Block chain to stop AI scams”
SomeoneSomewhere@lemmy.nz 1 day ago
Images, text, etc. can be generated entirely offline and independently. There is nothing to force the image to be attached to the blockchain, either directly or as a fingerprint.
You would have to do the opposite: when you take a picture or video (or write some text?), as it is recorded, the camera chipset signs the image/video using TPM-esque hardware, proving (ish) that it was captured by a real camera sensor.
The issue is that it’s pretty close to mandatory doxxing.
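The capture-time signing idea above can be sketched in a few lines. This is a toy illustration only: a real camera would use an asymmetric key pair whose private half never leaves the chipset, whereas here a stdlib HMAC with a made-up `DEVICE_KEY` stands in for that hardware signature.

```python
import hashlib
import hmac

# Hypothetical per-device key burned into TPM-esque hardware.
# (Assumption: stand-in for an asymmetric key whose private half
# never leaves the camera silicon.)
DEVICE_KEY = b"secret-key-inside-camera-silicon"

def capture_and_sign(sensor_bytes: bytes) -> tuple[bytes, str]:
    """Camera signs the raw sensor output at the moment of capture."""
    sig = hmac.new(DEVICE_KEY, sensor_bytes, hashlib.sha256).hexdigest()
    return sensor_bytes, sig

def verify(image_bytes: bytes, sig: str) -> bool:
    """Check that the bytes are exactly what the sensor produced."""
    expected = hmac.new(DEVICE_KEY, image_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

raw, sig = capture_and_sign(b"\x10\x20\x30 raw-sensor-data")
assert verify(raw, sig)                  # untouched capture verifies

edited = raw.replace(b"\x10", b"\x11")   # any post-processing at all...
assert not verify(edited, sig)           # ...breaks the signature
```

Note how the scheme proves too much: even a routine crop or colour correction invalidates the signature, which is exactly the post-processing problem raised below.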
rikudou@lemmings.world 1 day ago
And you can’t ever post-process images.
SomeoneSomewhere@lemmy.nz 2 hours ago
Theoretically you could include the original signed, unprocessed image (or make it available NFT-style) and let the viewer decide whether the difference from the post-processed image is reasonable or unreasonable.
It would, however, make it impossible to partially censor images without giving away the non-AI proof, unless you had a trusted third party™ verify the original and re-sign the censored version.
A ‘view cryptographically signed original’ button next to every Instagram post would be complete LOL, though.
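The trusted-third-party flow described above can be sketched as follows. Everything here is hypothetical: `CAMERA_KEY` and `NOTARY_KEY` are stand-ins for real asymmetric signing keys, and a real notary would also have to judge whether the redaction is a plausible edit of the original, which this toy skips.

```python
import hashlib
import hmac

CAMERA_KEY = b"camera-device-key"        # assumption: stand-ins for
NOTARY_KEY = b"trusted-third-party-key"  # real asymmetric key pairs

def sign(key: bytes, data: bytes) -> str:
    return hmac.new(key, data, hashlib.sha256).hexdigest()

def valid(key: bytes, data: bytes, sig: str) -> bool:
    return hmac.compare_digest(sign(key, data), sig)

# 1. The camera signs the original at capture time.
original = b"full uncensored image bytes"
cam_sig = sign(CAMERA_KEY, original)

# 2. The owner censors the image, then asks the notary to attest to it.
redacted = original.replace(b"uncensored", b"[REDACTED]")

def notarise(orig: bytes, orig_sig: str, red: bytes):
    """Notary checks the camera signature on the original, then
    re-signs the censored version (returns None if the check fails)."""
    if not valid(CAMERA_KEY, orig, orig_sig):
        return None
    return sign(NOTARY_KEY, red)

notary_sig = notarise(original, cam_sig, redacted)
assert notary_sig is not None

# 3. Viewers verify the censored copy without ever seeing the original.
assert valid(NOTARY_KEY, redacted, notary_sig)
```

The design choice being illustrated: the non-AI proof for the censored image now rests entirely on trusting the notary, not on the camera’s hardware signature.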
rikudou@lemmings.world 1 hour ago
That seriously can’t ever work on a technical level. You can simply remove the metadata from an image.
To do what you want even remotely, you’d have to create a new image format where the signature is unremovable, and make it impossible to convert such images to other formats.
That would probably have to involve DRM; I don’t think it’s possible without it. And even then there would be people who’d either crack your DRM or simply capture the direct output of the graphics card, and they’d have your image in an editable format.
In short, it’s impossible to do.
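The metadata-stripping objection is easy to demonstrate. The toy container below is an assumption, a simplified stand-in for how an EXIF-style metadata block travels alongside pixel data in real formats; any converter that re-encodes only the pixels discards the provenance for free.

```python
# Toy image container: pixel data plus a metadata section carrying
# the signature. (Assumption: mirrors EXIF-style metadata in real
# formats; values are made up.)
signed_image = {
    "pixels": [137, 80, 78, 71],
    "meta": {"device": "cam-1234", "signature": "deadbeef"},
}

def strip_metadata(image: dict) -> dict:
    """Re-encode keeping only pixel data, as any format converter can."""
    return {"pixels": list(image["pixels"])}

laundered = strip_metadata(signed_image)
assert "meta" not in laundered                        # provenance gone
assert laundered["pixels"] == signed_image["pixels"]  # picture intact
```

This is why the signature would have to be inseparable from the pixel data itself, which, as the comment argues, points toward DRM-style format lock-in.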
pugnaciousfarter@literature.cafe 9 hours ago
The point was to mitigate the harm, I suppose, not stop it entirely.
As for the mandatory doxxing, you’d just be registering on the blockchain, but wouldn’t give away any other information.
So the only info you give away is whether something is real or not.
SomeoneSomewhere@lemmy.nz 2 hours ago
Are you talking about the AI generator registering on the blockchain? Because there is essentially no incentive for them to do so and every incentive for them not to.
If you mean genuine camera images being registered on the blockchain, that would give away at minimum the time the image was taken, and probably what kind of device it was taken with and all other images taken by the same user. That’s a lot of data.
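The linkage problem described above can be made concrete. The ledger entries below are invented examples: even with no name attached, each record carries a timestamp and the signing device's public key, so anyone can reconstruct a device's (and hence a user's) entire capture history.

```python
from collections import defaultdict

# Hypothetical public ledger entries (all values made up): no names,
# but each record exposes a timestamp and the device's public key.
ledger = [
    {"image_hash": "a1", "timestamp": 1700000000, "device_pubkey": "pk-1"},
    {"image_hash": "b2", "timestamp": 1700003600, "device_pubkey": "pk-1"},
    {"image_hash": "c3", "timestamp": 1700007200, "device_pubkey": "pk-2"},
]

# Anyone reading the chain can group every image by signing device.
by_device = defaultdict(list)
for entry in ledger:
    by_device[entry["device_pubkey"]].append(entry["image_hash"])

assert by_device["pk-1"] == ["a1", "b2"]  # pk-1's whole history is public
```

One pseudonymous key per device is enough to correlate everything it ever signed, which is the "lot of data" the comment is pointing at.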