Comment on Nvidia reveals new A.I. chip, says costs of running LLMs will ‘drop significantly’

abhibeckert@lemmy.world 1 year ago

if I wanted to run, say, BLOOM (an open-source LLM), I’d need to spend close to $100K on hardware

Doesn’t that need 48 nodes with over a terabyte of RAM each? And state-of-the-art networking?

Sounds closer to $100M than $100K.
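A quick back-of-envelope sketch of that claim, using the 48-node figure from the comment. The per-GPU and per-node prices below are rough assumptions for illustration, not figures from the thread:

```python
# Back-of-envelope hardware cost estimate (all prices are assumptions).
# BLOOM-176B's training cluster used 48 nodes; assume 8x A100 80GB per node.
gpus_per_node = 8
nodes = 48
gpu_price = 15_000        # assumed price per A100 80GB, USD
node_overhead = 50_000    # assumed CPU/RAM/interconnect per node, USD

total = nodes * (gpus_per_node * gpu_price + node_overhead)
print(f"${total:,}")
```

Even with conservative assumptions this lands in the millions, orders of magnitude above $100K, though short of $100M.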
