Alibaba Cloud says it cut Nvidia AI GPU use by 82% with new pooling system: up to 9x increase in output lets 213 GPUs perform like 1,192
Submitted 8 hours ago by cm0002@lemmings.world to technology@lemmy.zip
Rentlar@lemmy.ca 7 hours ago
It should be noted: how much will that affect the lifespan of those GPUs running double duty x8?
AI’s still replaceable, but it will emulate human-like burnout.
BCOVertigo@lemmy.world 6 hours ago
From their linked study:
“Filling this utilization gap requires us to better saturate each GPU by enabling it to serve requests from multiple models. As such, we aim to conduct effective GPU pooling. By sharing a GPU between as many models as possible without violating the service-level objective (SLO), GPU pooling promises great reductions in operational expenses (OPEX) for concurrent LLM serving.”
The “saturate each GPU” part seems to support your idea.
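For anyone curious what “pooling” means in practice, here’s a rough first-fit packing sketch. The utilization-headroom model, the 0.9 threshold, and the toy per-model numbers are all assumptions for illustration; the real system has to honor per-request latency SLOs, which is much more involved than this.

```python
# Minimal sketch of the pooling idea from the quote: pack model workloads onto
# as few GPUs as possible while a rough "headroom" check holds. This is NOT
# Alibaba's actual scheduler; the capacity model (summed utilization must stay
# under a threshold) is an assumption made purely for illustration.
from dataclasses import dataclass, field

@dataclass
class Gpu:
    headroom: float = 0.9          # assumed max utilization before SLO risk
    load: float = 0.0              # summed utilization of models placed here
    models: list[str] = field(default_factory=list)

    def fits(self, util: float) -> bool:
        return self.load + util <= self.headroom

    def place(self, name: str, util: float) -> None:
        self.models.append(name)
        self.load += util

def pool(models: dict[str, float]) -> list[Gpu]:
    """First-fit decreasing: each model goes to the first GPU with enough headroom."""
    gpus: list[Gpu] = []
    # Placing the heaviest models first tends to waste less headroom.
    for name, util in sorted(models.items(), key=lambda kv: -kv[1]):
        target = next((g for g in gpus if g.fits(util)), None)
        if target is None:
            target = Gpu()
            gpus.append(target)
        target.place(name, util)
    return gpus

if __name__ == "__main__":
    # Toy workload: average per-model GPU utilization (made-up numbers).
    demo = {"llm-a": 0.5, "llm-b": 0.3, "llm-c": 0.2, "llm-d": 0.15, "llm-e": 0.1}
    for i, g in enumerate(pool(demo)):
        print(f"GPU {i}: {g.models} (load {g.load:.2f})")
```

With those toy numbers, five models fit on two GPUs instead of five, which is the whole pitch: most dedicated GPUs sit well below saturation, so sharing them cuts the fleet size.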
Rentlar@lemmy.ca 4 hours ago
I do expect operational savings from this optimization, but my guesstimate, measured over a fixed time period, would be more like 2-5x savings rather than the reported 9x.
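As a rough sanity check on the headline numbers themselves (taking the GPU counts at face value):

```latex
\frac{1192}{213} \approx 5.6\times \ \text{effective consolidation},
\qquad
1 - \frac{213}{1192} \approx 0.82 \ (\text{the quoted } 82\%)
```

So the 82% figure follows directly from the GPU counts, while the “up to 9x” is explicitly a peak rather than an average; the implied average of roughly 5.6x sits between that peak and the 2-5x guesstimate above.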