Comment on Google quietly released an app that lets you download and run AI models locally
Greg@lemmy.ca 6 days ago
Has this actually been done? If so, I assume it would only be able to use the CPU
Euphoma@lemmy.ml 5 days ago
Yeah, I have it running in Termux. Ollama is in the Termux package repos. The generation speed does feel like CPU speed, but idk
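For anyone wanting to try this, a rough sketch of the Termux setup described above (package names per the Termux repos; the model name is just an example, pick whatever fits your phone's RAM):

```shell
# Install Ollama from the Termux package repos
pkg update
pkg install ollama

# Start the Ollama server in the background
ollama serve &

# Pull and run a small model (example; larger models may not fit in phone RAM)
ollama run gemma3:1b
```

Note this runs on the CPU; Termux builds generally have no GPU acceleration, which matches the speed observed above.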