Comment on Consumer GPUs to run LLMs

marauding_gibberish142@lemmy.dbzer0.com 2 months ago

In general, how much VRAM do I need for 14B and 24B models?
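
(A back-of-the-envelope sketch, not an answer from the thread: assuming weight memory dominates and ignoring KV cache, activations, and runtime overhead, VRAM is roughly parameter count × bytes per parameter at the chosen quantization, plus a GB or two of headroom.)

```python
# Rough weights-only VRAM estimate; KV cache and runtime
# overhead are ignored, so add ~1-2 GB of headroom on top.
def vram_gb(params_billion: float, bits_per_weight: float) -> float:
    return params_billion * 1e9 * (bits_per_weight / 8) / 1024**3

for params in (14, 24):
    for label, bits in (("FP16", 16), ("Q8", 8), ("Q4", 4)):
        print(f"{params}B @ {label}: ~{vram_gb(params, bits):.1f} GB")
```

By this estimate, a 14B model needs roughly 26 GB at FP16 or about 7 GB at 4-bit quantization, and a 24B model roughly 45 GB at FP16 or about 11 GB at 4-bit, before cache and overhead.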
