Apple is the best on privacy though right?
Apple to Analyze User Data on Devices to Bolster AI Technology.
Submitted 4 weeks ago by Tea@programming.dev to technology@lemmy.world
https://machinelearning.apple.com/research/differential-privacy-aggregate-trends
Comments
hobovision@lemm.ee 4 weeks ago
Reyali@lemm.ee 4 weeks ago
Tell me you didn’t read the article without telling me you didn’t read the article.
The entire thing is explaining how they are upholding privacy to do this training.
- It’s opt-in only (if you don’t choose to share analytics, nothing is collected).
- They use differential privacy (adding noise so they get trends, not individual data).
- They developed a new method to train on text patterns without collecting actual messages or emails from devices. (link to research on arXiv)
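The differential-privacy idea in the second bullet can be sketched with the classic randomized-response technique (this is an illustrative toy, not Apple's actual mechanism, which is described in the linked research):

```python
import random

def randomized_response(true_bit: bool, p_truth: float = 0.75) -> bool:
    """Report the true bit with probability p_truth; otherwise a coin flip.
    Any single report is deniable, but aggregate frequencies can be
    de-biased to recover the population-level trend."""
    if random.random() < p_truth:
        return true_bit
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth: float = 0.75) -> float:
    """Invert the noise: observed = p_truth * true + (1 - p_truth) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

random.seed(0)
true_bits = [True] * 3000 + [False] * 7000  # 30% of users have the trait
reports = [randomized_response(b) for b in true_bits]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

No individual's answer reveals anything with certainty, yet the aggregate estimate lands near the real 30% rate, which is the "trends, not individual data" trade-off the article describes.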
MurrayL@lemmy.world 4 weeks ago
Right. There’s plenty to criticise Apple for, both in general and for chasing the AI trend, but looking at it purely in terms of user privacy within AI features they’re miles ahead of the competition.
hobovision@lemm.ee 4 weeks ago
I had scanned through it, and it looked like the exact same stuff that Google and Microsoft say. Paraphrasing: “we value your privacy” “we’re de-identifying your data” “the processing occurs on-device”…
Apple probably is better on privacy than other big tech corpos, but it’s a race to the bottom, and they’re definitely participating in the race.
deleted@lemmy.world 4 weeks ago
To be honest, it's important enough that it should be in the title, since privacy is Apple's main selling point.
huppakee@lemm.ee 4 weeks ago
Yes, they have said so themselves.
Viri4thus@feddit.org 4 weeks ago
Oh look, it’s the second shoe dropping.
fubarx@lemmy.world 4 weeks ago
Was working on a simulator and needed random interaction data. Statistical randomness didn’t capture likely scenarios (bell curves and all that). Switched to LLM synthetic data generation. Seemed better, but wait… seemed off 🤔. Checked it for clustering and entropy vs human data. JFC. Waaaaaay off.
Lesson: synthetic data for training is a Bad Idea. There are no shortcuts. Humans are lovely and messy.
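The entropy check mentioned above can be done in a few lines; event names here are made up for illustration, assuming discrete interaction events:

```python
from collections import Counter
from math import log2

def shannon_entropy(events) -> float:
    """Shannon entropy (in bits) of a discrete event sample."""
    counts = Counter(events)
    n = len(events)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical samples: human logs are messy, LLM output often collapses
# into a few repeated patterns.
human = ["click", "scroll", "click", "hover", "scroll", "back",
         "click", "type", "scroll", "hover"]
synthetic = ["click", "scroll", "click", "scroll", "click", "scroll",
             "click", "scroll", "click", "scroll"]
print(shannon_entropy(human), shannon_entropy(synthetic))
```

A noticeably lower entropy in the synthetic sample (here 1.0 bit vs ~2.17 bits) is one quick signal that the generator is mode-collapsing rather than matching human variability.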
plz1@lemmy.world 4 weeks ago
It would be nice if they actually fixed the stability issues in Apple Intelligence before they start adding more layers of slop to it. Writing tools summarization has been broken off and on since it launched.
LordCrom@lemmy.world 4 weeks ago
Holy crap, this is really intrusive. It's opt-in, but who would opt in to this harvesting at all?
Eggyhead@lemmings.world 4 weeks ago
Opt in means they’re building up the infrastructure to make it opt-out when nobody is looking.
taladar@sh.itjust.works 3 weeks ago
And then “accidentally” lose the opt-out setting every other update or so.
CompactFlax@discuss.tchncs.de 4 weeks ago
Ben Thompson has been saying for a decade that they need to collect user data (like Google does).
It seems the botched Apple Intelligence release changed some minds, a little bit.
Salvo@aussie.zone 4 weeks ago
That still doesn't give them the right to mine the data that their users entrusted to them through a paid service.
It doesn’t matter how anonymised their harvesting is, they had an agreement with their subscribers not to invade their privacy like this.
We are better off with an LLM that doesn't work than with one that abuses the data entrusted to them by their users.
It won’t be long until the LLM bubble bursts and we all laugh about how stupid we were to think they had any use whatsoever.
CompactFlax@discuss.tchncs.de 4 weeks ago
I guess you didn’t see the several points in the article where they make it clear that it is “opt in”?
I do look forward to the bursting of the LLM bubble, but the article isn't just about LLMs.
Ledericas@lemm.ee 3 weeks ago
They already admitted they aren’t generating profit from it