Comment on Consumer GPUs to run LLMs

marauding_gibberish142@lemmy.dbzer0.com 1 month ago

Thank you. Are 14B models the biggest you can run comfortably?
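For context, whether a 14B model fits "comfortably" mostly comes down to VRAM arithmetic: weights take roughly `parameters × bits-per-weight / 8` bytes, plus some headroom for the KV cache and activations. A rough sketch of that calculation (the function name and the 20% overhead figure are illustrative assumptions, not measured values):

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: int,
                     overhead: float = 0.20) -> float:
    """Approximate VRAM (decimal GB) needed to load a model's weights,
    with a flat overhead factor for KV cache and activations."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# A 14B model at common precisions:
for bits in (4, 8, 16):
    print(f"14B @ {bits}-bit: ~{estimate_vram_gb(14, bits):.1f} GB")
```

By this estimate a 4-bit quantized 14B model needs on the order of 8-9 GB, which is why it sits near the comfortable ceiling for many consumer GPUs, while fp16 would need over 30 GB. Actual usage varies with context length and runtime.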
