Comment on How to use GPUs over multiple computers for local AI?

marauding_gibberish142@lemmy.dbzer0.com 1 week ago

I’m not going to do anything enterprise. I’m not sure why people keep assuming that when I never mentioned it.

I plan to use 4 GPUs with 16-24 GB of VRAM each to run smaller models in the 24B range.
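For example, something along these lines, sharding one 24B-class model across all 4 GPUs with tensor parallelism (a rough sketch only; vLLM, the model name, and the settings here are placeholders, not a fixed plan):

```python
# Minimal sketch: shard a ~24B model across 4 local GPUs with vLLM tensor parallelism.
# The model id, dtype, and sampling settings are assumptions for illustration.
from vllm import LLM, SamplingParams

llm = LLM(
    model="mistralai/Mistral-Small-24B-Instruct-2501",  # any 24B-class model
    tensor_parallel_size=4,   # split the weights across the 4 GPUs
    dtype="float16",
)

params = SamplingParams(max_tokens=256, temperature=0.7)
outputs = llm.generate(["Explain tensor parallelism in one paragraph."], params)
print(outputs[0].outputs[0].text)
```

Spreading the same thing across multiple computers instead of one box would need a different setup (e.g. llama.cpp’s RPC mode or a Ray cluster for vLLM), which is what the original question was about.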
