Comment on Apple’s Decision to Kill Its CSAM Photo-Scanning Tool Sparks Fresh Controversy
mahony@lemmy.world 1 year ago
Client-side scanning of the contents of your phone is the most 1984 thing you will ever hear.
phillaholic@lemm.ee 1 year ago
It was client-side scanning only if you chose to upload those files to iCloud. The equivalent of having your ID checked before you enter a club.
rikonium@discuss.tchncs.de 1 year ago
Yes, however my chief concern (others may have different concerns; this is just off the top of my head) was that it breaks a major barrier: explicitly user-hostile code would be running on the device itself, one I own. I’d say it’s more the equivalent of club employees entering your home to check your ID prior to, or during, your club visit, and using your restroom/eating a snack while they’re there (the scanning would use “your” device’s resources).
There’s also the trivial nature of flipping the require_iCloud_photos=“true” value to “false”, whether by intention or by accident. I have an open ticket with Apple Support where my Apple Maps saved locations, favorites, guides, Home, reports, and reviews ALL vanished without a trace. I just got a callback today saying that engineering is aware of the problem and that it’s expected to be resolved in the next iOS update. In the meantime, I’m SOL. Accidents and problems can and do happen, and Apple isn’t the police.
And on top of that, there are concerns about upstream perversion of the CSAM database for other purposes. After all, who can audit it to ensure it’s used for CSAM exclusively, and who can add to it? Will the images from the device and database be pulled out for trials, or would it be a “trust the machine, the odds of false positives are x%” situation? (I believe some of these questions were answered back when the controversy was flying, but there are just a lot of cans of worms waiting with this.)
phillaholic@lemm.ee 1 year ago
The CSAM database isn’t controlled by Apple. It’s already in use practically everywhere. Apple tried to compromise between allowing private, encrypted image storage at scale and making sure they aren’t a hotbed for CSAM. Their competitors just keep it unencrypted and scan it for content, which, last time I checked, is worse 🤷‍♂️
Natanael@slrpnk.net 1 year ago
But Apple still fetches that list of hashes, and it can be made to send an alternative list to scan for.
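A toy sketch of that concern, assuming a deliberately simplified exact-hash scanner rather than Apple’s actual blinded-database protocol: the device treats the downloaded hash list as opaque data, so whoever controls that list controls what gets flagged, with no change to the scanning code.

```python
import hashlib

# Hypothetical illustration, not Apple's real client code: the on-device
# matcher only sees opaque hashes, so swapping in a different server-supplied
# list changes what gets flagged without touching the scanner itself.

def server_supplied_hashes(items: list[bytes]) -> set[str]:
    """Stand-in for the downloaded hash database; its contents are chosen server-side."""
    return {hashlib.sha256(item).hexdigest() for item in items}

def scan_device(photos: list[bytes], blocklist: set[str]) -> list[int]:
    """Return indices of local photos whose hash appears in the supplied blocklist."""
    return [i for i, p in enumerate(photos) if hashlib.sha256(p).hexdigest() in blocklist]

photos = [b"family picnic", b"protest footage", b"cat picture"]
print(scan_device(photos, server_supplied_hashes([b"protest footage"])))  # [1]
```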
uriel238@lemmy.blahaj.zone 1 year ago
Let’s say my grandson came to a realization that he was actually my granddaughter. She grows her hair long. She practices with make-up and gets some cute dresses and skirts, and is totally into it.
Now Apple knows.
And any law-enforcement interests that think it’s wrong or abusive can, by fiat, force Apple to let them know.
Same if my grandkid decides they’re pagan and goes from wearing a cross to wearing a pentacle.
Same if law enforcement notices that they are caramel colored, that mom is Germanic pale and dad is dark brown.
The US is a society in which neither the law nor law enforcement is on our side, and either can decide at any time that some arbitrary life shit is worthy of sending a SWAT team to collect us. And the GOP is determined to make it worse.
Kelsenellenelvial@lemmy.ca 1 year ago
Not really. The plan that Apple backpedaled on was to compare hashes of photos on device to hashes of known CSAM material. They wouldn’t see any user-generated photos unless there was a hash collision. Other companies have been known to report false positives on user-generated photos and delete accounts with no process to recover them.
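To make “hash collision” concrete, here’s a toy perceptual hash (a simple 8x8 average hash, not Apple’s actual NeuralHash, which was a neural-network perceptual hash) where two non-identical images produce the same fingerprint:

```python
# Toy 8x8 "average hash": one bit per pixel, set if the pixel is brighter
# than the image's mean. Two images that are not identical can still
# produce the same fingerprint, which is what a collision means here.

def average_hash(pixels: list[list[int]]) -> int:
    """Hash an 8x8 grayscale image into a 64-bit fingerprint."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

img_a = [[10] * 8 for _ in range(4)] + [[200] * 8 for _ in range(4)]
img_b = [[12] * 8 for _ in range(4)] + [[190] * 8 for _ in range(4)]  # different pixels, same structure
print(average_hash(img_a) == average_hash(img_b))  # True: a collision between non-identical images
```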
phillaholic@lemm.ee 1 year ago
They published a white paper on it. It would have taken many detected examples before they did anything about it. It’s not strictly a hash, as it’s not looking for exact copies but similar ones. Collisions have been proven, but afaik they were all reverse-engineered: just grey blobs of nonsense that match CSAM examples. I don’t recall hearing about someone’s randomly taken photo matching anything, but correct me if I’m wrong.
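Very roughly, the reporting logic amounted to something like this (a toy counter, assuming the roughly-30-match threshold described in Apple’s documentation; the real design used threshold secret sharing rather than a plain counter, so Apple couldn’t even decrypt matches below the threshold):

```python
# Simplified sketch of the threshold idea: a single match does nothing;
# an account would only be escalated for human review after the match
# count crosses a threshold (publicly reported as around 30). This toy
# counter ignores the threshold-secret-sharing cryptography entirely.

MATCH_THRESHOLD = 30

def should_escalate(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """Escalate for review only once enough independent matches have accumulated."""
    return match_count >= threshold

print(should_escalate(1))   # False: one stray collision is not actionable
print(should_escalate(30))  # True
```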
uriel238@lemmy.blahaj.zone 1 year ago
This assumes the program stays that way. Much the way Google promised no human would look at (or be able to look at) the data set, we don’t have an external oversight entity watching over Apple.
And then there’s the matter of mission creep, much the way the NSA PRISM program was supposed to only deal with foreign threats to national security (specifically Islamist terrorism) yet now it tells local precincts about large liquidatable assets that can be easily seized.
Even if it only looks at hash codes, it means law enforcement can add its own catalog of hashes to isolate and secure, say, content that is embarrassing to law enforcement, like videos of police gunning down unarmed, unresisting suspects in cold blood, which are challenged only when the event is captured on a private smartphone.
regalia@literature.cafe 1 year ago
You’re paying to reserve space in their cloud to store your encrypted bits. If you exchange money for that space, then you’re entitled to have it be encrypted and private.
phillaholic@lemm.ee 1 year ago
Find me any place you don’t own where you can store your stuff with no restrictions on what you can store there.
regalia@literature.cafe 1 year ago
Something like Proton Cloud, or a self hosted Nextcloud instance. If it’s encrypted, it’s nobody’s business.
player2@lemmy.dbzer0.com 1 year ago
phillaholic@lemm.ee 1 year ago
You’re thinking of Google, where data mining you is their primary business model. Google Photos scans your photos for object recognition; what do you think that is? There’s no E2E there at all. Apple’s object detection is done on device. It amazes me that Apple got attacked over this when literally everyone else is just doing it without telling you and not offering encryption.