Comment on Smaug-72B-v0.1: The New Open-Source LLM Roaring to the Top of the Leaderboard

cm0002@lemmy.world · 9 months ago

Oh, if only it were so simple lmao. You need ~130GB of VRAM, aka graphics card RAM. So you would need about 9 consumer-grade 16GB graphics cards, and you'll probably need Nvidia because of fucking CUDA, so we're talking thousands of dollars. Probably approaching $10k.

Ofc you can get cards with more VRAM each, but not in the consumer segment, so even more $$$$$$
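For anyone curious where numbers like that come from, here's a rough back-of-the-envelope sketch (my own estimate, not from the Smaug authors): at fp16, each of the model's ~72B parameters takes 2 bytes, and real usage adds KV cache and runtime overhead on top.

```python
import math

# Back-of-the-envelope VRAM estimate for a 72B-parameter model at fp16.
# Real usage needs more: KV cache, activations, and framework overhead.
params = 72e9          # approximate Smaug-72B parameter count
bytes_per_param = 2    # fp16/bf16 weights: 2 bytes each

weights_gib = params * bytes_per_param / 2**30
cards_needed = math.ceil(weights_gib / 16)  # hypothetical 16 GiB consumer cards

print(f"{weights_gib:.0f} GiB of weights -> {cards_needed} x 16 GiB cards")
# -> 134 GiB of weights -> 9 x 16 GiB cards
```

That's just the weights; quantized formats (8-bit, 4-bit) shrink this a lot, which is how people run big models on less hardware.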
