Comment on Mark Zuckerberg indicates Meta is spending billions of dollars on Nvidia AI chips

qupada@kbin.social 5 months ago

The estimated training time for GPT-4 was about 90 days, though.

Assuming you could scale that linearly with the amount of hardware, you'd get it down to about 3.5 days. From four times a year to twice a week.
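A quick back-of-the-envelope sketch of that linear-scaling assumption (the GPU counts here are illustrative guesses, not figures from the article or this thread):

```python
# Rough linear-scaling estimate of training time vs. GPU count.
# The GPU counts below are illustrative assumptions only.

baseline_days = 90          # estimated GPT-4 training time
baseline_gpus = 25_000      # assumed GPU count for that run
scaled_gpus = 600_000       # assumed size of a much larger cluster

# Perfect linear scaling: time shrinks in proportion to added hardware.
scaled_days = baseline_days * baseline_gpus / scaled_gpus
print(f"{scaled_days:.1f} days per run")   # ~3.8 days, i.e. roughly twice a week
```

In reality scaling is never perfectly linear (communication overhead grows with cluster size), so treat this as an upper bound on the speedup.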

If you're scrambling to get ahead of the competition, being able to iterate that quickly could very much be worth the money.

source