Comments on “How to use GPUs over multiple computers for local AI?”
marauding_gibberish142@lemmy.dbzer0.com 1 week ago
Thanks, but I’m not going to run supercomputers. I just want to run 4 GPUs in separate computers to run 24B-30B models, since a single computer doesn’t have enough PCIe lanes for all of them.
vane@lemmy.world 1 week ago
I believe you can run 30B models on a used RTX 3090 24GB; at least, I run 30B deepseek-r1 on one using ollama.
marauding_gibberish142@lemmy.dbzer0.com 1 week ago
Heavily quantized?
vane@lemmy.world 1 week ago
I run this one: ollama.com/…/deepseek-r1:32b-qwen-distill-q4_K_M, with this frontend: github.com/open-webui/open-webui, on a single RTX 3090 with 64GB of system RAM. It works quite well for what I wanted it to do.
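In case it helps, here’s a minimal sketch of calling that same model programmatically with the ollama Python client instead of open-webui. The example prompt is made up, and it assumes the ollama server is already running locally with the model pulled:

```python
# Minimal sketch: chatting with the model via the ollama Python client
# (pip install ollama). Assumes `ollama serve` is running and the model
# was pulled with: ollama pull deepseek-r1:32b-qwen-distill-q4_K_M
#
# Rough VRAM math for why this fits on a 24GB card: q4_K_M is roughly
# 4.5 bits/weight, so a 32B model is about 32e9 * 4.5 / 8 bytes ≈ 18 GB
# of weights, leaving some headroom for the KV cache.
import ollama

response = ollama.chat(
    model="deepseek-r1:32b-qwen-distill-q4_K_M",
    messages=[
        {"role": "user", "content": "Why do PCIe lanes matter for multi-GPU inference?"},
    ],
)
print(response["message"]["content"])
```

One caveat: the VRAM estimate above is approximate, and the KV cache grows with context length, so very long contexts can still push a 32B q4_K_M model past 24GB.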
marauding_gibberish142@lemmy.dbzer0.com 1 week ago
I see. Thanks