Apple is reportedly exploring a partnership with Google for Gemini-powered features on iPhones
Submitted 7 months ago by return2ozma@lemmy.world to technology@lemmy.world
Comments
50MYT@aussie.zone 7 months ago
This gives Google all the data.
Zrybew@lemmy.world 7 months ago
So, Apple is lagging on AI.
AA5B@lemmy.world 7 months ago
All speculation, but this particular part of AI, maybe. It’s like claiming Toyota is lagging on cars because they refused to recognize battery electric vehicles as a viable option. No, they do very well with many areas of car manufacturing, but not BEVs
helpImTrappedOnline@lemmy.world 7 months ago
How about they collaborate on a messaging standard instead?
rageagainstmachines@lemmy.world 7 months ago
No Big Tech company has your best interests in mind. Ever.
anon_8675309@lemmy.world 7 months ago
What does apple do with all that money???
Good grief.
AProfessional@lemmy.world 7 months ago
They make number go up, stock buybacks.
AliasAKA@lemmy.world 7 months ago
This is a really bad look. It will probably be an opt-in feature, and maybe Apple negotiates for a model it can host on-premises without sending any data back, but it's getting very hard for Apple to keep claiming privacy and protection (not that they do a particularly good job of that unless you disable all their telemetry).
If an LLM is gonna be on a phone, it needs to be local. Local is really hard because the models are huge (even with quantization and other tricks), so this seems incredibly unlikely. Then it's just "who do you trust to sell your data for ads more, Apple or Google?" To which I say neither, and pray Linux phones take off (yes, yes, I know you can root an Android phone and de-Google it, but still).
abhibeckert@lemmy.world 7 months ago
I don’t see how it’s any different from Google being the default search engine in Safari.
Also, phones don’t have terabytes of RAM. The idea that a (good) LLM can run on a phone is ridiculous. Yes, you can run small AI models on one, but they’re not very good models.
AliasAKA@lemmy.world 7 months ago
It may be no different from using Google as the search engine in Safari, assuming I get an opt-out. If it’s used for Siri interactions, though, it gets extremely tricky to verify that your interactions aren’t being used to inform ads or train an LLM. Much harder to opt out of than a default search engine, perhaps.
LLMs do not need terabytes of RAM. You can run quantized 7-billion-parameter models in 16 GB or less (e.g. Bloom and Falcon-7B; Falcon outperforms some models with larger memory footprints, by the way, so there’s room for optimization). While not quite as good as OpenAI’s offerings, they’re still quite good. There are Android phones with 24 GB of RAM, so it’s quite possible for Apple to release an iPhone Pro with that much and run a model the same way you’d run any large language model on an M1 or M2 Mac. You could probably fit an inference-only model in less. Performance wouldn’t be blazing, but depending on the task it could absolutely be sufficient. With Apple MLX and Ferret coming online, it’s entirely possible that you could, basically today, have a reasonable LLM running on an iPhone 15 Pro. People run OpenHermes-7B, for example, which uses ~4.4 GB, without those frameworks. Battery life does take a major hit, but to be honest I’m at a loss for what I’d need an LLM on my phone for anyway.
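As a rough sanity check on those numbers: a model's weight footprint is roughly parameter count × bytes per weight, and quantization shrinks the bytes-per-weight term. A minimal back-of-the-envelope sketch (the 1.2× runtime/KV-cache overhead factor is an assumption for illustration, not a measured value):

```python
def model_memory_gb(params_billion: float, bits_per_weight: int,
                    overhead: float = 1.2) -> float:
    """Rough memory estimate for running an LLM.

    params_billion  -- parameter count in billions (e.g. 7 for a 7B model)
    bits_per_weight -- 16 for fp16, 8 or 4 for common quantizations
    overhead        -- assumed fudge factor for KV cache and runtime
    """
    weight_bytes = params_billion * 1e9 * (bits_per_weight / 8)
    return weight_bytes * overhead / 2**30  # bytes -> GiB

# A 7B model at fp16 vs. 4-bit quantization:
print(f"7B @ fp16 : {model_memory_gb(7, 16):.1f} GB")  # ~15.6 GB
print(f"7B @ 4-bit: {model_memory_gb(7, 4):.1f} GB")   # ~3.9 GB
```

The 4-bit figure lines up with the ~4.4 GB cited for OpenHermes-7B above, and comfortably fits in a 24 GB phone; fp16 is what pushes a 7B model toward the 16 GB range.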
Regardless, I want a local LLM or none at all.
malloc@lemmy.world 7 months ago
Apple partnering with a substandard LLM / ChatGPT clone. Classic. Tim Cook needs to go.
autotldr@lemmings.world [bot] 7 months ago
This is the best summary I could come up with:
Apple is looking to team up with Google for a mega-deal to leverage the Gemini AI model for features on iPhone, Bloomberg reported.
This will put Google in a commanding position as the company already has a deal with Apple as the preferred search engine provider on iPhones for the Safari browser.
The publication cited people familiar with the matter saying that Apple is looking to license Google’s AI tech to introduce AI-powered features with iOS updates later this year.
The company’s job listings over the last year have suggested that Apple is working on multiple internal and external tools powered by generative AI.
Apple’s own models might power some of the on-device features on the upcoming iOS 18 software update — expected to be announced at the Worldwide Developer Conference (WWDC) historically held in June.
However, the company is exploring partnering with an external provider for generative AI use cases such as image creation and helping users with writing.
The original article contains 372 words, the summary contains 159 words. Saved 57%. I’m a bot and I’m open source!
baatliwala@lemmy.world 7 months ago
This will probably come up as evidence against both of them in a future anticompetitive-conduct case.
AliasAKA@lemmy.world 7 months ago
This should actually work against them. It would be more like “See, we’re not interested in competing, we’d rather maintain monopolies and cartel it up!”
JRepin@lemmy.ml 7 months ago
No wonder. All of GAFAM is a spyware-driven surveillance-capitalism mafia, and they work together. If you really want to THINK different, you need to look into libre and open-source software like GNU/Linux and the like.
Fubarberry@sopuli.xyz 7 months ago
I hadn’t really even considered that Apple wouldn’t be working on its own LLM. It seems like everyone is making their own LLM these days.
realharo@lemm.ee 7 months ago
They are, it’s just not very good (yet?) …substack.com/…/apple-is-working-on-multimodal-ai
Remember the early days of Apple Maps?
TheRealKuni@lemmy.world 7 months ago
If that’s any indication, Apple’s AI offerings will someday be as good as or better than Google’s. Apple Maps is pretty great these days, but it was absolute garbage when it rolled out.
abhibeckert@lemmy.world 7 months ago
Apple is working on models, but they seem to be focusing on ones that use tens of gigabytes of RAM, compared to the terabyte scale of the big cloud models.
I wouldn’t be surprised if Apple ships an “iPhone Pro” with 32GB of RAM dedicated to AI models. You can do a lot of really useful stuff with a model that size, but it can’t compete with GPT-4 or Gemini.
Fubarberry@sopuli.xyz 7 months ago
Tbf, Google has versions of Gemini that run locally on phones too (Gemini Nano), and its open-weight models run in about 16 GB of RAM or so.