Comment on Ollama not using AMD GPU on Arch Linux
possiblylinux127@lemmy.zip 3 weeks ago
I would run it in a Podman container with the GPU passed through
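Something like this is roughly what I mean. The device flags follow Ollama's own ROCm container docs; the container name, volume name, and port are just the defaults I'd pick, so adjust to taste:

```sh
# Run Ollama's ROCm image with the AMD GPU device nodes passed in.
podman run -d --name ollama \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  docker.io/ollama/ollama:rocm
```

Then you can talk to it on localhost:11434 as usual, or exec in with something like `podman exec -it ollama ollama run <model>`.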
30p87@feddit.org 3 weeks ago
Why not throw that into a VM with VFIO passthrough, plug the GPU in via an external dock, and, since we're already abstracting shit away for unnecessary complexity and incompatibility, do all of that on Windows?
possiblylinux127@lemmy.zip 3 weeks ago
Because that is way more complicated?
It is really easy to run ollama in a container.
30p87@feddit.org 3 weeks ago
Really easy to start running it, sure.
Then everything goes wrong, from configuration to logs to CUDA. And the worst fucking debugging ever.
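The usual first steps exist, sure (assuming the container is named ollama as above):

```sh
podman logs ollama            # server log, including whether the GPU was detected
podman exec -it ollama bash   # shell inside the container to poke around
```

But once the actual problem is a driver or ROCm mismatch between the host and the image, none of that gets you very far.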
possiblylinux127@lemmy.zip 3 weeks ago
On Linux you can download Alpaca. I think it is CPU-only, but it is simpler.
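If you want to try it, it's on Flathub; I believe the app ID is com.jeffser.Alpaca:

```sh
# Install the Alpaca GTK client from Flathub (app ID is my best recollection).
flatpak install flathub com.jeffser.Alpaca
```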
exu@feditown.com 3 weeks ago
Nested VMs stay performant about three levels deep, so do that as well.