Comment on Uses for local AI?
RandomLegend@lemmy.dbzer0.com 3 months ago
What on earth is going on with your keyboard?!
Besides that, I have 20GB of VRAM and 64GB of RAM. I can run the Mixtral 8x7B model at a relatively usable speed. Currently I use oobabooga the most.
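For context, here is a minimal sketch of how a quantized Mixtral 8x7B can be run with partial GPU offload using llama-cpp-python (my own assumption for illustration; the commenter uses the oobabooga UI, and the model file name and layer count below are placeholders, not their actual settings):

```python
# Rough sketch: run a quantized Mixtral 8x7B with some layers offloaded to a 20GB GPU.
# Assumes llama-cpp-python is installed with GPU support; the model path and
# n_gpu_layers value are hypothetical and would need tuning for real hardware.
from llama_cpp import Llama

llm = Llama(
    model_path="mixtral-8x7b-instruct-v0.1.Q4_K_M.gguf",  # hypothetical local GGUF file
    n_gpu_layers=20,   # offload as many layers as fit in VRAM; the rest stay in system RAM
    n_ctx=4096,        # context window size
)

out = llm("List three uses for a local LLM.", max_tokens=128)
print(out["choices"][0]["text"])
```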
WeLoveCastingSpellz@lemmy.dbzer0.com 3 months ago
I type very poorly on my phone. With that much VRAM you can get something like a 70B model. Definitely ask around in the KoboldAI community, that shit's crazy.
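For a sense of why a 70B model is plausible on that hardware, here is a rough back-of-envelope sketch (my own numbers, assuming a ~4-bit GGUF quantization; not measurements from either commenter):

```python
# Back-of-envelope memory estimate for a ~4-bit quantized 70B model.
# All figures are approximate assumptions, not benchmarks.
params = 70e9
bits_per_weight = 4.5          # roughly what a Q4_K_M GGUF uses per weight
model_gb = params * bits_per_weight / 8 / 1e9

vram_gb = 20                   # GPU memory from the comment above
ram_needed_gb = max(0, model_gb - vram_gb)

print(f"~{model_gb:.0f} GB total; ~{vram_gb} GB on GPU, ~{ram_needed_gb:.0f} GB in system RAM")
# ≈ 39 GB total, so roughly 19 GB spills into the 64 GB of system RAM: slow, but workable.
```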