Comment on Self-Hosted AI is pretty darn cool

theterrasque@infosec.pub ⁨3⁩ ⁨months⁩ ago

Llama 3 8B can run in about 6 GB of VRAM when quantized, and it’s fairly competent. Gemma has a 9B variant, I think, which would also be worth looking into.
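For the curious, the 6 GB figure works out from a quick back-of-envelope: weights at ~4–5 bits per parameter plus some headroom for the KV cache and activations. A minimal sketch (the bits-per-weight and overhead numbers are rough assumptions, not exact figures for any specific quant):

```python
# Rough VRAM estimate for a quantized LLM (illustrative, not exact).
def vram_estimate_gb(params_billion: float, bits_per_weight: float,
                     overhead_gb: float = 1.0) -> float:
    """Weight memory plus a flat allowance for KV cache / activations."""
    weights_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb + overhead_gb

# Llama 3 8B at ~4.5 bits/weight (a Q4_K_M-style quant) lands under 6 GB:
print(round(vram_estimate_gb(8, 4.5), 1))
```

At full fp16 (16 bits/weight) the same model would need ~16 GB for weights alone, which is why quantization is what makes the 6 GB claim possible.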
