Comment on Google will use hashes to find and remove nonconsensual intimate imagery from Search
AbouBenAdhem@lemmy.world 1 month ago
First I’ve heard of StopNCII… what’s to stop it from being abused to remove (say) images of police brutality or anything else states or corporations don’t want to be seen?
- SnoringEarthworm@sh.itjust.works 1 month ago
  - luisgutz@feddit.uk 1 month ago
    The one they dropped years ago?
    - Lost_My_Mind@lemmy.world 1 month ago
      That's the one!
  - pimento64@sopuli.xyz 1 month ago
    Yes, I, too, understood the point of the comment.
  - AbouBenAdhem@lemmy.world 1 month ago
    Ah yes—the only known force weaker than gravity.
    - Alexstarfire@lemmy.world 1 month ago
      Where do internal investigations fall?
 
GasMaskedLunatic@lemmy.dbzer0.com 1 month ago
Literally nothing. It will be applied more nefariously after it’s been proven capable.
FreedomAdvocate@lemmy.net.au 1 month ago
You guys are all acting like this “technology” is new lol. It’s the exact same way that all of the big companies detect CSAM - they have databases of hashes of known CSAM images, and every time you upload a file to their servers they hash your image and compare it to their database. If your uploads get a few matches, they flag your account for manual investigation.
All this is doing is applying the same process to other types of images: non-consensual intimate images, or “revenge porn” as it’s more commonly known.
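
For a concrete picture of that hash-and-compare step, here is a minimal sketch. It is not Google’s or StopNCII’s actual code: real deployments use purpose-built perceptual hashes such as PhotoDNA or PDQ, while this example stands in the open-source `imagehash` library, an invented threshold, and made-up hash values.

```python
from PIL import Image
import imagehash  # pip install imagehash pillow

# Stand-in database of hashes of known non-consensual images.
# In a real system the digests are submitted via services like StopNCII and
# the original images never leave the reporter's device; these hex values
# are made up for the example.
KNOWN_HASHES = [
    imagehash.hex_to_hash("d1f0e9c3a5b87742"),
    imagehash.hex_to_hash("8c3b17f2e6a0d594"),
]

# Perceptual hashes survive resizing and re-encoding, so a "match" means a
# small Hamming distance rather than byte-for-byte equality. Threshold invented.
MATCH_THRESHOLD = 8

def matches_known_hash(image_path: str) -> bool:
    """Hash the uploaded image and compare it against every known hash."""
    upload_hash = imagehash.phash(Image.open(image_path))
    return any(upload_hash - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

if matches_known_hash("upload.jpg"):
    print("Possible match: queue for manual review")
```

The point of the perceptual hash is that minor edits (cropping, compression, resizing) still land within the distance threshold, which is why a hash list can catch re-uploads without anyone ever storing or sharing the images themselves.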
CSAM detection has systems in place to prevent abuse in the way you mention: the hash databases are managed by separate companies all over the world, and an image has to match across multiple databases, precisely to stop it from being abused that way. I would assume this is the same.
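
And a sketch of the cross-database check described above, again purely illustrative: the database names, hashes, and agreement threshold are invented, but it shows how requiring matches from several independently managed hash lists keeps any single operator from getting an arbitrary image flagged.

```python
from typing import Dict, Set

# Hypothetical, independently managed hash databases (names and hashes invented).
HASH_DATABASES: Dict[str, Set[str]] = {
    "db_alpha": {"d1f0e9c3a5b87742", "8c3b17f2e6a0d594"},
    "db_beta":  {"d1f0e9c3a5b87742"},
    "db_gamma": {"d1f0e9c3a5b87742", "1a2b3c4d5e6f7081"},
}

# Require agreement from at least this many separate databases before acting,
# so a single poisoned or abused database can't trigger removals on its own.
REQUIRED_AGREEMENT = 2

def should_escalate(upload_hash: str) -> bool:
    """Escalate only when several independent databases contain the hash."""
    hits = sum(1 for hashes in HASH_DATABASES.values() if upload_hash in hashes)
    return hits >= REQUIRED_AGREEMENT

print(should_escalate("d1f0e9c3a5b87742"))  # True: listed in all three databases
print(should_escalate("1a2b3c4d5e6f7081"))  # False: listed in only one
```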