Comment on Google quietly released an app that lets you download and run AI models locally
Euphoma@lemmy.ml 5 weeks ago
You can use it in Termux
Greg@lemmy.ca 5 weeks ago
Has this actually been done? If so, I assume it would only be able to use the CPU.
Euphoma@lemmy.ml 5 weeks ago
Yeah, I have it in Termux. Ollama is in the Termux package repos. The generation speed does feel like CPU speed, but I'm not sure.
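For anyone wanting to try it, a minimal sketch of the setup described above (run inside Termux; the model name is just an example, not something mentioned in the thread):

```shell
# Update package lists and install Ollama from the Termux repos
pkg update
pkg install ollama

# Start the Ollama server in the background
ollama serve &

# Pull and chat with a small model (example model name; pick one
# small enough for your phone's RAM)
ollama run gemma3:1b
```

Without GPU acceleration this runs on the CPU, so smaller quantized models are the practical choice on a phone.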