Try PocketPal instead
Comment on “Google quietly released an app that lets you download and run AI models locally”
Greg@lemmy.ca 9 months ago
Ollama can’t run on Android
pirat@lemmy.world 9 months ago
Diplomjodler3@lemmy.world 9 months ago
Is there any useful model you can run on a phone?
Euphoma@lemmy.ml 9 months ago
You can use it in termux
Greg@lemmy.ca 9 months ago
Has this actually been done? If so, I assume it would only be able to use the CPU
Euphoma@lemmy.ml 9 months ago
Yeah, I have it in Termux. Ollama is in the package repos for Termux. The generation speed does feel like CPU speed, but idk
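For anyone curious, a rough sketch of what this looks like, assuming the `ollama` package is still in the current Termux repos (model name here is just an example of something small enough for phone RAM):

```shell
# Inside Termux on Android -- install and run Ollama on-device.
pkg update
pkg install ollama

# Start the Ollama server in the background.
ollama serve &

# Pull and chat with a small model (example choice; pick what fits your RAM).
ollama run llama3.2:1b
```

This runs on CPU only, so expect slow generation compared to a desktop GPU.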
AmbiguousProps@lemmy.today 9 months ago
That’s fair, but I think I’d rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
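The remote setup is simple since Ollama exposes an HTTP API on port 11434. A minimal client sketch in Python (the server IP and model name are placeholders, not from this thread):

```python
import json
import urllib.request

# Placeholder address of a self-hosted Ollama server on the local network.
OLLAMA_URL = "http://192.168.1.50:11434/api/generate"

def build_request(model, prompt):
    """Build the JSON payload for Ollama's /api/generate endpoint.

    stream=False asks for one complete JSON response instead of a stream.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt):
    """Send a prompt to the Ollama server and return the generated text."""
    payload = json.dumps(build_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Any Android client that speaks this API (or just Termux + curl) can talk to the same server.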
OhVenus_Baby@lemmy.ml 9 months ago
How is Ollama compared to GPT models? I used the paid tier for work and I’m curious how this stacks up.
AmbiguousProps@lemmy.today 9 months ago
It’s decent, with the deepseek model anyway. It’s not as fast and has a lower parameter count though. You might just need to try it and see if it fits your needs or not.
Greg@lemmy.ca 9 months ago
Yes, that’s my setup. But this will be useful for cases where internet connection is not reliable
gens@programming.dev 9 months ago
Llama.cpp (which Ollama runs on) can. And many chat apps for phones can use it.
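If you want to skip the Ollama wrapper entirely, llama.cpp builds directly in Termux. A sketch, assuming a working Termux toolchain (the model file path is a placeholder; you supply your own GGUF file):

```shell
# Build llama.cpp from source inside Termux.
pkg install git cmake clang
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build -j

# Run a one-shot prompt against a local GGUF model (placeholder path).
./build/bin/llama-cli -m /path/to/model.gguf -p "Hello"
```

Chat apps that support llama.cpp backends then just point at the same GGUF file.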