Comment on Smaug-72B-v0.1: The New Open-Source LLM Roaring to the Top of the Leaderboard
ivanafterall@kbin.social 9 months agoSo can I run it on my Radeon RX 5700? I overclocked it some and am running it as a 5700 XT, if that helps.
L_Acacia@lemmy.one 9 months ago
To run this model locally at GPT-4 writing speed you need at least 2 x 3090 or 2 x 7900 XTX. VRAM is the limiting factor in 99% of cases for inference. With your hardware you could try a smaller model like Mistral-Instruct or SOLAR, though.
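As a rough back-of-envelope sketch of why two 24 GB cards are needed (the quantization level, overhead factor, and helper function here are my own assumptions, not from the comment above): model weights dominate VRAM use, so at 4-bit quantization a 72B-parameter model already outgrows a single consumer GPU.

```python
# Rough VRAM estimate for local LLM inference.
# Assumptions (illustrative, not from the thread): 4-bit quantization
# (~0.5 bytes per parameter) plus ~20% overhead for KV cache/activations.

def vram_needed_gb(params_billion: float,
                   bytes_per_param: float = 0.5,
                   overhead: float = 1.2) -> float:
    """Estimated VRAM in GB needed to run inference."""
    return params_billion * bytes_per_param * overhead

# A 72B model at 4-bit: ~43 GB, more than one 24 GB 3090/7900 XTX,
# but it fits split across two of them.
print(round(vram_needed_gb(72), 1))

# A 7B model like Mistral at 4-bit: ~4 GB, fits the 8 GB on an RX 5700.
print(round(vram_needed_gb(7), 1))
```

The exact numbers shift with quantization format and context length, but the shape of the argument holds: parameter count, not compute, is what rules out the RX 5700 for a 72B model.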