Comment on Google quietly released an app that lets you download and run AI models locally
Euphoma@lemmy.ml 4 days ago
You can use it in Termux.
Greg@lemmy.ca 4 days ago
Has this actually been done? If so, I assume it would only be able to use the CPU.
Euphoma@lemmy.ml 3 days ago
Yeah, I have it running in Termux. Ollama is in the Termux package repos. The generation speed does feel like CPU speed, but I can't say for certain.
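For anyone who wants to try this, a minimal sketch of the setup described above, assuming the `ollama` package in the Termux repo provides the standard Ollama CLI (the model name below is just an example, not something the commenters specified):

```shell
# Install Ollama from the Termux package repository
pkg update && pkg install ollama

# Start the Ollama server in the background
ollama serve &

# Pull and chat with a small model (example model; pick one that fits your phone's RAM)
ollama run gemma2:2b
```

On a phone this runs CPU-only, so smaller quantized models are the practical choice.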