Comment on What budget friendly GPU for local AI workloads should I aim for?
Lumisal@lemmy.world 1 week agoYes but you have to find the now kinda rare used NVLink ideally
MalReynolds@slrpnk.net 1 week ago
Nah, NVLink is irrelevant for inference workloads (nearly all the compute happens on the cards themselves; the model is split across multiple GPUs and only the activations are piped over PCIe as needed). It's mildly useful for training, but you'll get there without it.
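To see why the PCIe link isn't the bottleneck, here's a back-of-the-envelope sketch with hypothetical 7B-class model dimensions (all the numbers are illustrative assumptions, not measurements from any specific card):

```python
# Toy sketch of pipeline-split inference across two GPUs:
# each card holds half the layers' weights; per generated token,
# only a single hidden-state vector crosses the PCIe boundary.

HIDDEN = 4096          # hidden size (typical for a 7B-class model)
LAYERS = 32            # transformer layers, split 16/16 across two cards
BYTES_PER_PARAM = 2    # fp16 weights/activations

# Rough per-layer parameter count (attention + MLP ~ 12 * H^2)
params_per_layer = 12 * HIDDEN * HIDDEN
weights_per_gpu = (LAYERS // 2) * params_per_layer * BYTES_PER_PARAM

# Data crossing PCIe per token: one hidden-state vector at the split point
transfer_per_token = HIDDEN * BYTES_PER_PARAM

print(f"weights resident per GPU : {weights_per_gpu / 2**30:.1f} GiB")
print(f"PCIe traffic per token   : {transfer_per_token / 2**10:.1f} KiB")
```

Gigabytes of weights stay put on each card while only a few kilobytes per token cross the bus, so even PCIe 3.0 has bandwidth to spare and NVLink buys you nothing here.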