Comment on Smaug-72B-v0.1: The New Open-Source LLM Roaring to the Top of the Leaderboard
rs137@lemmy.world 9 months ago Llama 2 70B with 8-bit quantization takes around 80 GB of VRAM, if I remember correctly. I tested it a while ago.
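That ~80 GB figure lines up with a simple back-of-envelope estimate: weight memory is roughly parameter count times bytes per weight, plus some runtime overhead (KV cache, activations). Here's a minimal sketch; the 15% overhead fraction is an assumption for illustration, not a measured value:

```python
def estimate_vram_gb(params_billions: float, bits_per_weight: int,
                     overhead_frac: float = 0.15) -> float:
    """Rough VRAM estimate for loading an LLM's weights.

    overhead_frac is a hypothetical allowance for KV cache and
    activations; real usage varies with context length and batch size.
    """
    weights_gb = params_billions * bits_per_weight / 8  # weights only
    return weights_gb * (1 + overhead_frac)

# 70B parameters at 8-bit: 70 GB of weights + overhead ≈ 80 GB
print(round(estimate_vram_gb(70, 8), 1))
```

The same math shows why 4-bit quantization is popular: it halves the weight footprint, bringing a 70B model closer to ~40 GB.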