Comment on Consumer GPUs to run LLMs

marauding_gibberish142@lemmy.dbzer0.com 1 month ago

Do you have any recommendations for running the Mistral Small model? I’m very interested in it, alongside CodeLlama, Oobabooga’s text-generation-webui, and others.
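For anyone else landing on this thread: one common way to run Mistral Small on a consumer GPU is a GGUF quantization loaded through llama-cpp-python. A minimal sketch below, assuming you've installed llama-cpp-python with GPU support and downloaded a quantized model file (the file name here is illustrative, not a real download link):

```python
# Minimal sketch: run a GGUF quantization of Mistral Small locally
# via llama-cpp-python (pip install llama-cpp-python, built with GPU offload).
from llama_cpp import Llama

llm = Llama(
    model_path="mistral-small-instruct-q4_k_m.gguf",  # hypothetical local file; use whatever quant you downloaded
    n_gpu_layers=-1,  # offload all layers to the GPU; lower this number if you run out of VRAM
    n_ctx=8192,       # context window; reduce to save VRAM
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize the tradeoffs of 4-bit quantization."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```

The same GGUF file also loads in Oobabooga's text-generation-webui via its llama.cpp backend, so the quant you pick (e.g. Q4_K_M for a mid-range card) matters more than the frontend.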
