Comment on How to use GPUs over multiple computers for local AI?

vane@lemmy.world 1 week ago

I believe you can run 30B models on a used RTX 3090 (24 GB); at least, I run a 30B DeepSeek-R1 on it using ollama.
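A rough back-of-envelope check of why this fits: ollama typically serves 4-bit quantized GGUF weights, so a 30B-parameter model needs about 0.5 bytes per parameter plus some headroom for KV cache and CUDA context. The exact overhead figure below is an assumption, not a measurement:

```python
# Rough VRAM estimate for a 30B-parameter model at 4-bit quantization.
# Assumptions (illustrative, not measured): ~0.5 bytes/param for Q4 weights,
# plus a few GB for KV cache and CUDA context overhead.
params = 30e9
bytes_per_param = 0.5            # 4-bit quantized weights
weights_gb = params * bytes_per_param / 1024**3
overhead_gb = 4.0                # KV cache + runtime overhead (rough guess)
total_gb = weights_gb + overhead_gb
print(f"weights ~{weights_gb:.1f} GB, total ~{total_gb:.1f} GB")
# A 3090 has 24 GB, so the model fits without offloading to system RAM.
```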
