Comment on Why doesn't Nvidia have more competition?
Glitchvid@lemmy.world 3 days ago
Expounding on that: Nvidia has very deeply ingrained itself in educational and research institutions. People learning GPU compute are being taught CUDA and Nvidia hardware. Researchers have access to farms of Nvidia chips.
AMD has basically taken the “build it and they will come” attitude, and has the results to match.
brucethemoose@lemmy.world 3 days ago
Except they didn’t.
They repeatedly fumble the software with little mistakes (looking at you, Flash Attention). They price the MI300X and any high-VRAM GPU through the roof, when they have every reason to be more competitive and undercut Nvidia. They have sad, incomplete software efforts divorced from what devs are actually doing, like their quantization framework or some inexplicably bad LLMs they trained themselves.
They give no one any reason to give them a chance, then wonder why no one comes. Lisa Su could fix this with literally like two phone calls (remove the VRAM restrictions on their OEMs, and fix the stupid small bugs in ROCm), but they don’t.
Glitchvid@lemmy.world 3 days ago
That’s basically what I said, in so many words. AMD is doing its own thing; if you want what Nvidia offers, you’re gonna have to build it yourself. WRT pricing, I’m pretty sure AMD is typically a fraction of the price of Nvidia hardware on the enterprise side, from what I’ve read.
The biggest culprit, from what I can gather, is that AMD’s GPU side is basically still ATI camped out in Markham, divorced from the rest of the company in Austin that is doing great work on the CPU side.
brucethemoose@lemmy.world 3 days ago
I’m not as sure about this, but it seems like AMD is taking a fat margin on the MI300X (and its successor?) and kinda ignoring the performance penalty. It’s easy to say “build it yourself!” but the reality is that very few can, or will, do this; most will simply try to deploy vLLM or vanilla TRL or something as best they can (and run into the same issues everyone does).
The ‘enthusiast’ side where all the tinkerer devs reside is totally screwed up though. AMD’s mirroring Nvidia’s VRAM cartel pricing when they have absolutely no reason to. It’s completely bonkers. AMD would be in a totally different place right now if they had sold 40GB/48GB 7900s for an extra $100 or $200.
Yeah, it does seem divorced from the CPU division. But a lot of the badness comes from business decisions, even when the silicon is quite good, and some of that must be from Austin.
Glitchvid@lemmy.world 3 days ago
Eh, the biggest issue here is that most (post-secondary) students probably just have a laptop for whatever small GPGPU learning they’re doing, and that space is overwhelmingly dominated by Nvidia. Grad students will have access to institutional resources, which are also dominated by Nvidia (this has been a concerted effort).
Only the few who explicitly pursue AMD hardware will end up with it, and that requires significant foundational work. So the easiest path for research is to throw students at CUDA and Nvidia hardware.
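(For a sense of what that path looks like: the first exercise in just about any GPGPU course is a vector add along these lines. This is a rough sketch from memory, not any particular course’s material, but everything in it, from the cuda* runtime calls to the <<<>>> launch syntax, ties the student to Nvidia’s toolchain and hardware from day one.)

```cpp
// Minimal CUDA vector add, roughly the first exercise in most GPGPU courses.
#include <cstdio>
#include <cuda_runtime.h>

// Kernel: each thread adds one pair of elements.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Unified (managed) memory keeps the example short; also Nvidia-specific.
    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(a, b, c, n);  // CUDA's language-extension launch syntax
    cudaDeviceSynchronize();

    printf("c[0] = %.1f (expect 3.0)\n", c[0]);
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Build it with nvcc and it only runs where an Nvidia GPU and driver are present, which in practice means whatever hardware the department already bought.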
Basically, Nvidia has entrenched itself in the research/educational space, and that space is slow-moving (Java is still the de facto standard in CS curricula, with only a slow shift toward Python at some universities), so I don’t see much changing unless AMD decides it’s very hungry and wants to chase the market.