Comment on Consumer GPUs to run LLMs

RagingHungryPanda@lemm.ee ⁨5⁩ ⁨days⁩ ago

I haven’t tried those, so not really, but with Open WebUI you can download and run almost anything; just make sure it fits in your VRAM so it doesn’t fall back to the CPU. The DeepSeek one is decent. I find I like ChatGPT-4o better, but it’s still good.
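If you want a quick sanity check on the “fits in VRAM” part, here’s a rough back-of-envelope sketch (my own estimate, not anything from Open WebUI): weights take roughly params × bits ÷ 8 bytes, plus some headroom for the KV cache and runtime buffers. The 20% overhead factor is an assumption, not a measured number.

```python
def estimate_vram_gb(params_billions: float, bits_per_weight: int,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate in GB: weight size plus ~20% for KV cache/buffers.

    overhead=1.2 is an assumed fudge factor, not a precise figure.
    """
    weight_gb = params_billions * bits_per_weight / 8  # 1B params at 8-bit ~ 1 GB
    return weight_gb * overhead

# Example: a 7B model quantized to 4 bits per weight
print(round(estimate_vram_gb(7, 4), 1))  # ~4.2 GB, fits on an 8 GB card
```

Compare the result against your card’s VRAM (e.g. via `nvidia-smi`) before downloading a model; if the estimate is bigger, layers will spill to the CPU and generation gets much slower.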
