Comment on How to use GPUs over multiple computers for local AI?
just_another_person@lemmy.world 1 week ago
I assume you're talking about a CUDA implementation here. There are ways to do this with that system, and even sub-projects that expand on it. I'm mostly pointing out how pointless it is for you to do this. What a waste of time and money.
marauding_gibberish142@lemmy.dbzer0.com 1 week ago
Used 3090s go for $800. I was planning to wait for ARC B580 prices to drop and then buy a few. The reason for the networked setup is that none of the used computers I looked at had enough PCIe lanes. If there's either an affordable card with good performance and 48GB of VRAM, or an affordable motherboard + CPU combo with plenty of PCIe lanes under $200, then I'll gladly drop the idea of distributed AI. I just need lots of VRAM, and this is the only way I could think of to get it.
Thanks
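As a back-of-the-envelope check on the VRAM requirement driving this (my own sketch, not from the thread — the 20% overhead factor is an assumption, and actual usage depends on context length and runtime):

```python
def vram_gb(params_b: float, bits: int, overhead: float = 1.2) -> float:
    """Rough VRAM (GB) to load a model: weights = params * bits/8 bytes,
    plus ~20% assumed overhead for KV cache and activations."""
    weight_gb = params_b * bits / 8  # e.g. a 70B model at 4-bit ≈ 35 GB of weights
    return round(weight_gb * overhead, 1)

print(vram_gb(70, 4))   # 70B model, 4-bit quant → ~42 GB, near the 48GB target
print(vram_gb(70, 16))  # same model at fp16 → far beyond any single consumer card
```

By this estimate a 4-bit 70B model just about fits in 48GB on one box, which is why the alternative is spreading it across multiple machines.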
just_another_person@lemmy.world 1 week ago
PLEASE look back at the crypto mining rush of a decade ago. I implore you.
You’re buying into something that doesn’t exist.