Comment on A.I. Video Generators Are Now So Good You Can No Longer Trust Your Eyes

danhab99@programming.dev 2 weeks ago
The NFTs tried to solve this problem already and it didn’t work. You can change the hash/sig of a video file by just changing one pixel on one frame, meaning you just tricked the computer, not the people who use it.

Kissaki@feddit.org 1 week ago
By changing one pixel it’s no longer signed by the original author. What are you trying to say?
danhab99@programming.dev 1 week ago
Exactly that: if I change a pixel, then the cryptographic signature breaks
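The point being argued here is standard avalanche behavior of cryptographic hashes: a one-bit change anywhere in the signed bytes produces a completely different digest, so a signature over the original no longer verifies. A minimal sketch in Python (the "video" is just a hypothetical byte buffer standing in for real container data):

```python
import hashlib

# Hypothetical "video" payload: raw bytes standing in for a real file.
original = bytearray(b"\x00" * 1024)

# The digest stands in for the signed value here; a real signature
# scheme would sign this digest with a private key.
original_digest = hashlib.sha256(original).hexdigest()

# Flip a single bit in one "pixel".
tampered = bytearray(original)
tampered[512] ^= 0x01

tampered_digest = hashlib.sha256(tampered).hexdigest()

# One changed bit yields an entirely different digest, so any signature
# over the original bytes fails to verify against the tampered copy.
print(original_digest != tampered_digest)  # True
```

Which cuts both ways, as the thread notes: it defeats naive "is this the signed original?" lookups, but it is also exactly how you detect that a file has been altered.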
lightsblinken@lemmy.world 1 week ago
so try again? also: if a pixel changes then it isn’t the original source video, by definition. being able to determine that it has been altered is entirely the point.
TheBlackLounge@lemmy.zip 1 week ago
The point was to sign AI footage so you know what’s fake. NFTs can be used as a decentralized repository of signatures. You could realistically require the companies to participate, but the idea doesn’t work because you can edit footage so it doesn’t match the signature. More robust signatures exist, but none is good enough, especially since the repo would have to be public.
Signing real footage makes even less sense. You’d have to trust everybody and their uncle’s signature.
dragonfly4933@lemmy.dbzer0.com 1 week ago
The signing keys could be published to DNS, for better or worse.
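Publishing keys in DNS already has precedent in email: DKIM puts a selector-scoped public key in a TXT record. A hypothetical zone fragment in that style (the selector, label, and domain below are made up for illustration):

```
; Hypothetical: publish a video-signing public key the way DKIM
; publishes mail keys. Selector "video2024" and the "_videosig"
; label are invented here, not an existing standard.
video2024._videosig.example.com. 3600 IN TXT "v=vsig1; k=ed25519; p=<base64 public key>"
```

A verifier would resolve the record for the claimed signer's domain and check the signature against the published key, inheriting DNS's trust properties (and its weaknesses, absent DNSSEC).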
TheBlackLounge@lemmy.zip 1 week ago
What would that solve? NFTs don’t have to be power-hungry proof of work, that was just for the monkeys. The public ledger part of this is not the problem.