Comment on What budget friendly GPU for local AI workloads should I aim for?
state_electrician@discuss.tchncs.de 1 week ago
I heard about people using multiple used 3090s in a single motherboard for this. Apparently it delivers a lot of bang for the buck, as compared to a single card with loads of VRAM.
Lumisal@lemmy.world 1 week ago
Yes, but ideally you'd also want to find an NVLink bridge, which is now kinda rare on the used market.
MalReynolds@slrpnk.net 1 week ago
Nah, NVLink is irrelevant for inference workloads (inference happens almost entirely on the cards themselves: the model's layers are split across the GPUs, and only the activations at the split points cross PCIe). It's mildly useful for training, but you'll get there without it.
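To make the splitting concrete, here's a toy sketch of pipeline-style layer placement across two 24 GB 3090s. All the numbers are illustrative assumptions (layer size, layer count, and the VRAM left after reserving headroom for KV cache), not measurements of any real model:

```python
def assign_layers(num_layers, layer_gb, gpu_vram_gb):
    """Greedily place consecutive layers on each GPU until its VRAM budget fills."""
    placement = {}  # layer index -> gpu index
    gpu = 0
    used = 0.0
    for layer in range(num_layers):
        if used + layer_gb > gpu_vram_gb[gpu]:
            gpu += 1          # spill over to the next card
            used = 0.0
            if gpu >= len(gpu_vram_gb):
                raise MemoryError("model does not fit on these GPUs")
        placement[layer] = gpu
        used += layer_gb
    return placement

# Hypothetical: 40 layers at ~0.9 GB each, two 3090s with ~20 GB
# usable after reserving room for KV cache and overhead.
plan = assign_layers(num_layers=40, layer_gb=0.9, gpu_vram_gb=[20, 20])
print(sum(1 for g in plan.values() if g == 0), "layers on GPU 0")  # → 22
```

During generation, each token's forward pass runs through GPU 0's layers, then only the hidden state at the boundary is copied to GPU 1 over PCIe, which is why the interconnect bandwidth barely matters for inference.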