Comment on Apple’s Decision to Kill Its CSAM Photo-Scanning Tool Sparks Fresh Controversy
uriel238@lemmy.blahaj.zone 1 year ago
Let’s say my grandson came to a realization that he was actually my granddaughter. She grows her hair long. She practices with make-up and gets some cute dresses and skirts, and is totally into it.
Now Apple knows.
And any law-enforcement interests that think it’s wrong or abusive can, by fiat, force Apple to let them know.
Same if my grandkid decides they’re pagan and goes from wearing a cross to wearing a pentacle.
Same if law enforcement notices that they are caramel-colored, while mom is Germanic pale and dad is dark brown.
The US is a society in which neither law nor law enforcement is on our side, and either can at any time decide that arbitrary life shit is worthy of sending a SWAT team to collect us. And the GOP is determined to make it worse.
Kelsenellenelvial@lemmy.ca 1 year ago
Not really. The plan that Apple backpedaled on was to compare hashes of photos on device to hashes of known CSAM material. They wouldn’t see any user-generated photos unless there was a hash collision. Other companies have been known to report false positives on user-generated photos and delete accounts with no process to recover them.
phillaholic@lemm.ee 1 year ago
They published a white paper on it. It would have taken many detected examples before they did anything about it. It’s not strictly a hash, as it’s not looking for exact copies but similar ones. Collisions have been proven, but afaik they were all reverse-engineered: just grey blobs of nonsense that match CSAM examples. I don’t recall hearing about someone’s randomly taken photo matching with anything, but correct me if I’m wrong.
Kelsenellenelvial@lemmy.ca 1 year ago
True, it’s hash-like in that the comparison uses some mathematical representation of the source material. It was intended to be a little fuzzy so it would still catch minor alterations like cropping, watermarks, rendering to a new format, etc.
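The fuzzy-matching idea above can be sketched in a few lines. This is a toy average hash, not Apple’s actual NeuralHash, and the 4x4 pixel grids and distance threshold are hypothetical stand-ins: each image reduces to a bit string, and two images "match" when the Hamming distance between their bit strings is within a tolerance, so minor edits still hit.

```python
def average_hash(pixels):
    """Toy perceptual hash: threshold each grayscale pixel at the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p >= mean else 0 for p in flat)

def hamming(h1, h2):
    """Count differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# Hypothetical downscaled grayscale image.
original = [
    [200, 200,  30,  30],
    [200, 200,  30,  30],
    [ 30,  30, 200, 200],
    [ 30,  30, 200, 200],
]

# Slightly altered copy (think re-encoding or a brightness tweak).
altered = [row[:] for row in original]
altered[0][0] = 180  # still above the mean, so the hash bit is unchanged

THRESHOLD = 2  # hypothetical tolerance
distance = hamming(average_hash(original), average_hash(altered))
print(distance <= THRESHOLD)  # True: the minor edit stays within tolerance
```

An exact cryptographic hash (SHA-256, say) would flip completely on that one-pixel change; the fuzziness is the whole point, and also what makes engineered collisions possible.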
The example I heard of was someone who was using an app for a remote doctor’s appointment. The doctor requested photos of the issue, a rash in the genital area of a minor; supposedly one included an adult hand touching the area involved. That photo ended up in Google’s cloud service, where it was flagged and reported to law enforcement, and that user’s whole Google account was frozen. The investigation quickly confirmed the innocence of the photo and provided official documentation of that, but last I heard Google would not release the account.
phillaholic@lemm.ee 1 year ago
Google has unencrypted access to your files to do whatever they want with. Do we know whether this was the same CSAM system or one of Google’s internal ones? Google Photos does its face and object scanning in the cloud, where Apple does it on device.
uriel238@lemmy.blahaj.zone 1 year ago
This assumes the program stays that way. Much the way Google promised no human would look at (or be able to look at) the data set, we don’t have an external oversight entity watching over Apple.
And then there’s the matter of mission creep, much the way the NSA PRISM program was supposed to only deal with foreign threats to national security (specifically Islamist terrorism) yet now it tells local precincts about large liquidatable assets that can be easily seized.
Even if it only looks at hash codes, it means law enforcement can add its own catalog of hashes to isolate and secure, say, content that is embarrassing to law enforcement, like videos of police gunning down unarmed, unresisting suspects in cold blood, which are challenged only when the event is captured on a private smartphone.