Comment on Apple watching & logging EVERY APP YOU OPEN [Louis Rossmann]
lemmyvore@feddit.nl 1 year ago
"Apple applies E2E encryption for almost all iCloud data with Advanced Data Protection"
They only started doing that in December, it has not rolled out to everyone and everything yet, and like you said it won’t cover everything even then — mail, contacts and calendar will not be included. (And they considered backdooring it for a while before they relented.)
Even the E2E aspect is misleading. The encryption ultimately relies on a password, which can be brute-forced because most people don’t use particularly complex passwords for their iCloud account. Hardware security keys are something Apple has only very recently made possible to use.
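To put numbers on the brute-force point, here is a back-of-the-envelope sketch. The guessing rate is an assumed, illustrative figure for an offline attack, not a measured one:

```python
# Back-of-the-envelope: why a typical password is brute-forceable offline.
# The guessing rate below is an illustrative assumption, not a benchmark.

charset = 26 + 26 + 10              # lowercase + uppercase + digits
length = 8                          # a common password length
keyspace = charset ** length        # total candidate passwords

guesses_per_sec = 1e10              # assumed offline GPU-rig guessing rate
seconds = keyspace / guesses_per_sec

print(f"{keyspace:.2e} candidates, ~{seconds / 86400:.2f} days worst case")
```

An 8-character alphanumeric password falls in well under a day at that assumed rate, which is why key-derivation hardening and hardware keys matter more than the raw cipher.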
theverge.com/…/apple-end-to-end-encryption-icloud…
schneier.com/…/apple-is-finally-encrypting-icloud…
Bottom line, it would be more correct to say that Apple has recently made privacy improvements. But for the longest time they were nowhere near the privacy champion they styled themselves as.
octalfudge@lemmy.world 1 year ago
Apple’s stated reason for not covering mail, contacts and calendar is “Because of the need to interoperate with the global email, contacts, and calendar systems, iCloud Mail, Contacts, and Calendar aren’t end-to-end encrypted”. I think it’s worth mentioning that critical bit of context. support.apple.com/en-sg/guide/security/…/web. Apple does have to balance usability and security, though this might not be as secure / private as you or I would like.
I think it’s a little misleading to say they considered backdooring it. They intended to scan images for CSAM before they were uploaded to iCloud Photo Library. A lot of the speculation was that they wanted to E2EE photos but were worried about the reaction from the FBI and other bodies, given the FBI had pressured them on this before, and so settled on this compromise. If they had managed to do this, they wouldn’t have been able to access the photos after upload, hence the need to scan them before uploading.
They attempted to do this in a fairly complex (and, honestly, still relatively privacy-preserving) way, by comparing perceptual hashes, but perhaps they realised (from the feedback accompanying the backlash) that this could easily be abused by authoritarian governments, so they abandoned the idea.
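For readers unfamiliar with the idea, perceptual-hash matching can be sketched roughly as follows. This is a toy average-hash example, not Apple’s actual NeuralHash system; the function names and the match threshold are illustrative choices:

```python
# Toy illustration of perceptual-hash matching (average hash).
# Not Apple's NeuralHash; names and threshold are illustrative.

def average_hash(pixels):
    """Hash a grayscale image (2D list of 0-255 values):
    each bit is 1 if the pixel is above the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def is_match(h1, h2, threshold=2):
    # Unlike a cryptographic hash, a perceptual hash tolerates small
    # edits: near-duplicate images stay within a small bit distance.
    return hamming_distance(h1, h2) <= threshold

original = [[10, 200], [220, 30]]
near_dup = [[12, 198], [225, 28]]   # slightly altered copy
h1, h2 = average_hash(original), average_hash(near_dup)
print(is_match(h1, h2))             # the edited copy still matches
```

The property that slightly edited images still match is exactly what made the scheme useful for detecting known imagery, and also what worried critics: whoever controls the hash database controls what gets flagged.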
I would assume that a company like Apple is under significant pressure to add back doors, and that they cater to an audience that is unforgiving of any slight reduction in performance or ease of use and wants security features that are almost fully transparent to them. Given these constraints, I’m not sure they can improve much faster than they have demonstrated. Smaller, open-source projects probably don’t face these constraints.