Comment on Consumer GPUs to run LLMs

swelter_spark@reddthat.com 2 weeks ago

With Ollama, all you have to do is copy an extra folder of ROCm files. Not hard at all.
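For anyone wondering what that looks like in practice, here's a rough sketch of the manual Linux install with the ROCm add-on. The URLs and the `/usr` target follow Ollama's Linux install docs as I remember them, so double-check them against the current release page before running; the ROCm tarball is the "extra folder" being referred to:

```shell
# Base install: download and extract the main Ollama tarball
# (URL assumed from Ollama's Linux install docs; verify before use).
curl -L https://ollama.com/download/ollama-linux-amd64.tgz \
  -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz

# The "extra folder of ROCm files": a second tarball extracted to the
# same prefix, which drops the ROCm runtime libraries alongside the
# Ollama binary so AMD GPUs get picked up.
curl -L https://ollama.com/download/ollama-linux-amd64-rocm.tgz \
  -o ollama-linux-amd64-rocm.tgz
sudo tar -C /usr -xzf ollama-linux-amd64-rocm.tgz
```

If you installed via the one-line install script instead, it handles this for you when it detects an AMD GPU, so the copy step only matters for manual installs.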
