Comment on How to use GPUs over multiple computers for local AI?
vane@lemmy.world 1 week agoI believe you can run 30B models on a used RTX 3090 24GB. At least, I run 30B deepseek-r1 on one using ollama.
marauding_gibberish142@lemmy.dbzer0.com 1 week ago
Heavily quantized?
vane@lemmy.world 1 week ago
I run this one: ollama.com/…/deepseek-r1:32b-qwen-distill-q4_K_M, with this frontend: github.com/open-webui/open-webui, on a single RTX 3090 with 64GB of system RAM. It works quite well for what I wanted it to do.
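A rough back-of-envelope sketch of why a q4-quantized 32B model fits in 24GB of VRAM (assumed figures; the ~4 bits per weight ignores quantization group-scale overhead and the KV cache, which add a few GB on top):

```python
# Rough VRAM estimate for the model weights of a 32B-parameter
# model at 4-bit (q4) quantization.
params = 32e9                 # 32 billion parameters
bytes_per_param = 0.5         # q4 ~= 4 bits per weight
weights_gb = params * bytes_per_param / 1024**3
print(f"~{weights_gb:.1f} GB of weights")  # well under the 3090's 24 GB
```

At fp16 the same weights would need roughly 60GB, which is why the quantized variant is the one that fits on a single 3090.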
marauding_gibberish142@lemmy.dbzer0.com 1 week ago
I see. Thanks