Comment on Uses for local AI?
WeLoveCastingSpellz@lemmy.dbzer0.com 3 months ago
The answer is very specific to your PC and the amount of VRAM available to you. But anything Llama 3, even 8B models fine-tuned to DM or write stories, should theoretically work. The other reply that recommends connecting to another program to keep the rules consistent sounds like a great idea, which I have not tried. I use SillyTavern as the UI, which has lots of options and stuff to make things work well. I would recommend going into the "KoboldAI" Discord and asking in the support section; folks there are very helpful. Sorry for not being able to give a straight answer. Good luck!
RandomLegend@lemmy.dbzer0.com 3 months ago
What on earth is going on with your keyboard?!
Besides that, I have 20GB of VRAM and 64GB of RAM. I can run the Mixtral 8x7B model at relatively usable speeds. Currently I use oobabooga the most.
WeLoveCastingSpellz@lemmy.dbzer0.com 3 months ago
I type very poorly on my phone. With that much VRAM you can get something like a 70B model. Definitely ask around in the KoboldAI community; that shit's crazy.
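As a rough sketch of why the answer depends so much on VRAM: a quantized model's weights take roughly (parameter count × bits per weight ÷ 8) bytes, plus some overhead for the KV cache and runtime. The function below is a hypothetical back-of-the-envelope estimator (the 1.2 overhead factor is an assumption, not a measured value); it suggests a 4-bit 70B model overflows 20GB of VRAM on its own, which is why runners like KoboldAI/llama.cpp offload some layers to system RAM.

```python
# Rough VRAM estimate for a quantized LLM (hypothetical sketch).
# Assumes weights dominate memory; real usage adds KV cache, context
# length, and backend overhead, folded here into a guessed 1.2 factor.

def vram_gb(params_billions: float, bits_per_weight: float,
            overhead: float = 1.2) -> float:
    """Approximate VRAM in GB: parameters x bytes/weight x overhead."""
    weight_bytes = params_billions * 1e9 * (bits_per_weight / 8)
    return weight_bytes * overhead / 1e9

# 70B at 4-bit: ~42 GB, far over a 20GB card, so layers get offloaded.
print(round(vram_gb(70, 4), 1))
# 8B at 4-bit: ~4.8 GB, fits comfortably alongside a long context.
print(round(vram_gb(8, 4), 1))
```

Under these assumptions, partial offloading (splitting layers between the 20GB GPU and the 64GB of system RAM mentioned above) is what makes the larger models "relatively usable" rather than impossible.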