Comment on Mark Zuckerberg indicates Meta is spending billions of dollars on Nvidia AI chips

31337@sh.itjust.works 11 months ago

The equivalent of 600k H100s seems pretty extreme, though. IDK how many OpenAI has access to, but it’s estimated they “only” used 25k to train GPT-4. OpenAI has, in the past, claimed that the diminishing returns from simply scaling their model beyond GPT-4’s size probably aren’t worth it. So maybe Meta is planning to experiment with new ANN architectures, or planning a mass deployment of models?
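Just to put those two figures side by side (both numbers are the estimates from this comment, not official disclosures), a quick back-of-envelope:

```python
# Rough comparison: Meta's reported ~600k H100-equivalents vs. the
# ~25k chips OpenAI is estimated to have used to train GPT-4.
# Both figures are the estimates quoted in the comment above.
meta_h100_equiv = 600_000
gpt4_training_chips = 25_000

ratio = meta_h100_equiv / gpt4_training_chips
print(f"Meta's fleet is roughly {ratio:.0f}x the estimated GPT-4 training cluster")
# → Meta's fleet is roughly 24x the estimated GPT-4 training cluster
```

A ~24x gap over a single training run is hard to explain by training alone, which is why serving models at scale (inference) or parallel architecture experiments look like plausible explanations.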

source