You can record and edit videos on your own devices, but that doesn’t mean it’s suddenly free for Netflix or YouTube to stream their videos to you.
Surely a local version of Alexa could be developed, but that development would come with its own costs.
Some things simply can't be done locally, such as a web search. Route calculations for a map application are also often done in the cloud.
ours@lemmy.film 1 year ago
Having “AI functionality” doesn’t mean they can just get rid of the big/expensive models they use now.
If they’re anything like OpenAI’s LLMs, they require very beefy machines with a ton of expensive RAM.
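Back-of-the-envelope (parameter counts here are just illustrative, not official figures): the weights alone dominate the memory bill, before you even count activations or the KV cache.

```python
# Rough estimate of RAM needed just to hold model weights in memory.
# Parameter counts are illustrative examples, not vendor-confirmed figures.
def weight_memory_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Memory for the weights alone, ignoring activations and KV cache."""
    return params_billions * 1e9 * bytes_per_param / 1e9

for name, params in [("7B model", 7), ("70B model", 70), ("175B model (GPT-3 scale)", 175)]:
    print(f"{name}: ~{weight_memory_gb(params):.0f} GB at fp16, "
          f"~{weight_memory_gb(params, 4):.0f} GB at fp32")
```

And that’s before quantization tricks, which is part of how the phone-sized models get away with so little.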
Hazdaz@lemmy.world 1 year ago
Well that’s exactly what I was thinking when these companies were making these claims… like HOW could they possibly handle this locally on a CPU or GPU when there must be a massive database that (I assume) is constantly being updated? Didn’t make sense.
ours@lemmy.film 1 year ago
“AI” doesn’t use a database per se; these systems are trained models built from large amounts of training data.
Some models run fine on small devices (like the ones running on phones to enhance pictures), but others, like OpenAI’s LLMs, are huge.
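To make the distinction concrete, here’s a minimal sketch assuming the Hugging Face `transformers` library and the small, publicly available `distilgpt2` checkpoint (~82M parameters, my pick purely for illustration): once the weights are downloaded, inference is just math over a local file, with no database queries involved.

```python
# Minimal sketch: a small model runs entirely from a locally cached weights
# file -- nothing here talks to a database. Assumes `transformers` is
# installed and `distilgpt2` has been downloaded (or can be, on first run).
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")
result = generator("Running a model locally means", max_new_tokens=20)
print(result[0]["generated_text"])
```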
Hazdaz@lemmy.world 1 year ago
Wouldn’t that data be stored in some kind of database?
blazeknave@lemmy.world 1 year ago
You’re right. Run an LLM locally, adjacent to your application sandboxes and local user apps, and your office will lower its heating bills.