brucethemoose@lemmy.world 3 weeks ago
Kinda odd. 8 GPUs to a CPU is pretty much standard, and less ‘wasteful,’ as the CPU ideally shouldn’t do much for ML workloads.
LPCAMM is sick though. So is the sheer compactness of this thing; I bet HPC folks will love it.
Badabinski@kbin.earth 3 weeks ago
Yeah, 88/2 is weird as shit. Perhaps the GPUs are especially large? I know NVIDIA has that thing where you can slice up a GPU into smaller units (I can't remember what it's called, it's some fuckass TLA), so maybe they're counting on people doing that.
brucethemoose@lemmy.world 3 weeks ago
They could be ‘doubled up’ under the heatspreader, yeah, so kinda 4x GPUs to a CPU.
And yeah… perhaps they’re maintaining CPU ‘parity’ with 2P EPYC for slicing it up into instances.