Comment on Very large amounts of gaming gpus vs AI gpus

BrightCandle@lemmy.world ⁨1⁩ ⁨day⁩ ago

Initially a lot of AI was trained on consumer-class GPUs, before these specialized AI cards/blades existed. The problem is that the models themselves are quite large, so they either need a lot of VRAM on one device or must be split across devices, paying enormous latency penalties going over the network. Putting it all into one giant package costs a lot more, but it also performs a lot better, because AI training is not an embarrassingly parallel problem that can be split across many GPUs without penalty.
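To make the VRAM argument concrete, here is a rough back-of-the-envelope sketch (the parameter counts and per-GPU memory sizes are illustrative assumptions, and it counts weights only, ignoring activations, optimizer state, and KV cache):

```python
import math

def vram_needed_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """Weights-only VRAM estimate in GB (fp16 = 2 bytes per parameter).
    Real training needs considerably more for activations and optimizer state."""
    return params_billions * 1e9 * bytes_per_param / 1e9

def gpus_required(params_billions: float, vram_per_gpu_gb: float) -> int:
    """Minimum GPUs needed just to hold the weights, ignoring all overhead."""
    return math.ceil(vram_needed_gb(params_billions) / vram_per_gpu_gb)

# A hypothetical 70B-parameter model in fp16 needs ~140 GB for weights alone:
print(vram_needed_gb(70))      # 140.0
# On 24 GB gaming cards that means splitting across at least 6 GPUs,
# with every forward pass crossing the interconnect:
print(gpus_required(70, 24))   # 6
# A single large-HBM accelerator package can hold it without splitting:
print(gpus_required(70, 141))  # 1
```

The split itself is why the "one giant package" wins: once the weights span several cards, every layer boundary turns into network traffic instead of on-package memory access.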
