Comment on How to use GPUs over multiple computers for local AI?

marauding_gibberish142@lemmy.dbzer0.com 1 week ago

It is because modern consumer GPUs do not have enough VRAM to load a 24B-parameter model. I want to run Mistral Small locally.
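For a rough sense of why 24B doesn't fit on one card, here's a back-of-envelope sketch of parameter memory alone (it ignores the KV cache, activations, and framework overhead, so real usage is higher):

```python
# Rough VRAM estimate for a 24B-parameter model at common precisions.
# Parameter memory only; KV cache and runtime overhead add several GiB more.
PARAMS = 24e9

for name, bytes_per_param in [("FP16", 2), ("8-bit", 1), ("4-bit", 0.5)]:
    gib = PARAMS * bytes_per_param / 2**30
    print(f"{name}: ~{gib:.0f} GiB")

# FP16:  ~45 GiB
# 8-bit: ~22 GiB
# 4-bit: ~11 GiB
```

Even a top-end 24 GB consumer card only fits the 4-bit quantization comfortably, which is why splitting the model across multiple GPUs or machines comes up at all.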
