I’ve been playing with llama.cpp a bit for the last week and it’s surprisingly workable on a recent laptop just using the CPU. It’s not really hard to imagine Apple and others adding (more) AI accelerators on mobile.
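For anyone curious what that looks like in practice, here's a minimal sketch using the llama-cpp-python bindings rather than the raw CLI (assuming you've installed them and downloaded a quantized GGUF model; the model path and thread count below are placeholders to adjust for your machine):

```python
# Minimal sketch: running a small quantized model on CPU only via llama-cpp-python.
# The model file is a placeholder; any 4-bit quantized GGUF model should work.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,    # context window size
    n_threads=8,   # CPU threads; tune this to your laptop
)

out = llm(
    "Q: Why would Apple want AI to run on-device? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(out["choices"][0]["text"])
```

A 4-bit 7B model like this manages roughly a few tokens per second on a recent laptop CPU alone, which is what makes the on-device idea plausible.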
Comment on Apple wants AI to run directly on its hardware instead of in the cloud
OhmsLawn@lemmy.world 1 year ago
How’s that supposed to work?
I’m picturing a backpack full of batteries and graphics cards. Maybe they’re talking about a more limited model?
exu@feditown.com 1 year ago
Apollo2323@lemmy.dbzer0.com 1 year ago
Oh yes, and the CPUs on phones have been getting more powerful every year with nothing that could really take advantage of their full potential. Now, with a local AI, that power will be great for privacy and responsiveness.
Ghostalmedia@lemmy.world 1 year ago
Google is already doing this with Gemini Nano. …google.com/…/pixel-feature-drop-december-2023
MiltownClowns@lemmy.world 1 year ago
They’re making their own silicone now. You can achieve a lot more efficiency when you’re streamlined the whole way through.
hips_and_nips@lemmy.world 1 year ago
silicone
It’s silicon. Silicon is a naturally occurring chemical element, whereas silicone is a synthetic substance.
Silicon is for computer chips, silicone is for boobies.
ImFresh3x@sh.itjust.works 1 year ago
By making their own, you mean telling Taiwan Semiconductor Manufacturing Company, “hey, we’re going to buy enough of these units that you have to give us the specs we chose at a better price than the competitors, and since we chose the specs off your manufacturing capacity sheets, we’ll say ‘engineered in Cupertino™’.”
Btw I’m not shitting on Apple here. I love my M2 processor.
MiltownClowns@lemmy.world 1 year ago
Sure, but being that pedantic is neither concise nor pertinent to the question at hand.
ImFresh3x@sh.itjust.works 1 year ago
Is Apple making or engineering silicon?
eager_eagle@lemmy.world 1 year ago
Yes, like Google is doing with their Tensor chips
abhibeckert@lemmy.world 1 year ago
This is a Financial Times article, regurgitated by Ars Technica. The article isn’t by a tech journalist, it’s by a business journalist, and their definition of “AI” is a lot looser than what you’re thinking of.
I’m pretty sure they’re talking about things that Apple is already doing, not just on current hardware but even on hardware from a few years ago. The algorithms don’t require anywhere near the kind of power necessary for ChatGPT or even Stable Diffusion. For example, the keyboard on iOS now uses pretty much the same technology as ChatGPT, but scaled way, way down, to the point where "Tiny Language Model" would probably be more accurate.
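To put rough numbers on that scale difference (the parameter counts here are my own illustrative assumptions, not figures Apple or OpenAI have published for these exact models), a quick back-of-the-envelope comparison of memory footprints:

```python
# Back-of-the-envelope: why a "tiny" language model fits on a phone.
# Parameter counts below are illustrative assumptions, not published figures.

def model_size_gb(params: int, bytes_per_param: float = 2.0) -> float:
    """Approximate in-memory size assuming 16-bit (2-byte) weights."""
    return params * bytes_per_param / 1e9

tiny_keyboard_model = 35_000_000        # tens of millions of parameters (assumed)
gpt3_scale_model = 175_000_000_000      # GPT-3-class model, for comparison

print(f"tiny on-device model: ~{model_size_gb(tiny_keyboard_model) * 1000:.0f} MB")
print(f"GPT-3-scale model:    ~{model_size_gb(gpt3_scale_model):.0f} GB")
```

A model measured in tens of megabytes fits comfortably in a phone’s RAM; one measured in hundreds of gigabytes does not, which is the whole point of scaling the same technology way, way down.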
Very cool technology - but it’s hardly “AI”.