well that looks like small enterprise scale
Comment on How to use GPUs over multiple computers for local AI?
marauding_gibberish142@lemmy.dbzer0.com 1 week ago
I’m not going to do anything enterprise. I’m not sure why people keep framing it that way when I didn’t even mention it.
I plan to use 4 GPUs with 16-24GB VRAM each to run smaller 24B models.
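For context on why 16–24GB cards can handle a 24B model, here is a rough back-of-the-envelope VRAM estimate. The 1.2x overhead factor for KV cache and activations is an assumption for illustration, not a measured value:

```python
# Rough VRAM estimate for a 24B-parameter model at common quantization
# levels, split across 4 GPUs. Overhead factor is an assumed ballpark.

def vram_gb(params_billion: float, bits_per_weight: float,
            overhead: float = 1.2) -> float:
    """Approximate total VRAM in GB: weight bytes times an overhead factor."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

for label, bits in [("FP16", 16), ("Q8", 8), ("Q4", 4)]:
    total = vram_gb(24, bits)
    print(f"{label}: ~{total:.0f} GB total, ~{total / 4:.0f} GB per GPU on 4 GPUs")
```

By this estimate a Q4-quantized 24B model fits comfortably on a single 16GB card, and even FP16 (~58 GB) splits into ~15 GB per GPU across four cards.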
WhyJiffie@sh.itjust.works 1 week ago
marauding_gibberish142@lemmy.dbzer0.com 1 week ago
If you consider 4 B580s as enterprise, sure I guess
WhyJiffie@sh.itjust.works 1 week ago
not that, but 4 GPUs with 16–24GB of VRAM each, I do
Xanza@lemm.ee 1 week ago
I’m not going to do anything enterprise.
You are, though. You’re creating a GPU cluster for generative AI, which is an enterprise endeavor…
marauding_gibberish142@lemmy.dbzer0.com 1 week ago
Only because PCIe slots and lanes come at a premium on consumer motherboards and CPUs. If I didn’t have to worry about PCIe, I wouldn’t care about a networked AI cluster. But yes, I take your point.
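For what it’s worth, one existing way to spread inference over machines instead of PCIe slots is llama.cpp’s RPC backend. This is only a sketch; the hostnames, ports, and model filename below are placeholders, and flags change between llama.cpp versions, so check the current docs:

```shell
# On each GPU machine: build llama.cpp with the RPC backend enabled,
# then start a worker that exposes its local GPU over the network.
./rpc-server -p 50052

# On the head node: point llama-cli at the workers; model layers are
# distributed across the remote GPUs over the network.
./llama-cli -m model-24b-q4.gguf \
    --rpc 10.0.0.2:50052,10.0.0.3:50052 \
    -ngl 99 -p "Hello"
```

The trade-off is that layer-to-layer traffic now crosses the network, so link speed (ideally 10GbE+) matters a lot more than it would with cards in one chassis.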
False@lemmy.world 1 week ago
I didn’t say you were; I said you’re asking about a topic that enters that territory.
marauding_gibberish142@lemmy.dbzer0.com 1 week ago
I see. Thanks