You bet your ollama I am.
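For anyone curious, a minimal sketch of what "running it locally" with Ollama looks like (assumes Ollama is installed; the model name is an example from the Ollama library and availability/size may change):

```shell
# Pull and run a model locally with Ollama.
# "mixtral" here is an example tag; the 4-bit 8x7B weights are roughly 26 GB.
ollama pull mixtral   # downloads the quantized weights
ollama run mixtral    # opens an interactive chat in the terminal
```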
Comment on OpenAI's GPT Trademark Request Has Been Denied
XEAL@lemm.ee 9 months ago
BTW, are you running that locally?
Dalraz@lemmy.ca 9 months ago
UraniumBlazer@lemm.ee 9 months ago
Naah, I think this model needs a crazy amount of VRAM to run. I’m stuck with 4 GB :(
neutron@thelemmy.club 9 months ago
Did you use a specific website to use Mixtral? I want to try but system requirements are crazy.
UraniumBlazer@lemm.ee 9 months ago
huggingface.co/chat
QuadratureSurfer@lemmy.world 9 months ago
You can run it locally with an RTX 3090 or less (as long as you have enough RAM), but there’s a bit of a tradeoff in speed when using more system RAM vs VRAM.
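The RAM-vs-VRAM tradeoff comes from splitting the model's layers between the GPU and system memory (llama.cpp-style layer offloading). A rough sketch of the arithmetic, with all numbers below being illustrative assumptions rather than measured values:

```python
# Back-of-the-envelope split of a quantized model between VRAM and RAM.
# Assumes all layers are roughly equal-sized, which is a simplification.

def split_layers(model_gb: float, n_layers: int, vram_gb: float) -> tuple[int, int]:
    """Return (layers_on_gpu, layers_on_cpu) for a given VRAM budget."""
    per_layer_gb = model_gb / n_layers
    gpu_layers = min(n_layers, int(vram_gb // per_layer_gb))
    return gpu_layers, n_layers - gpu_layers

# Assumed example: a ~26 GB 4-bit quantization spread over 32 layers.
print(split_layers(26.0, 32, 24.0))  # 24 GB card (e.g. RTX 3090): most layers on GPU
print(split_layers(26.0, 32, 4.0))   # 4 GB card: only a few layers fit, rest hit RAM
```

Layers left on the CPU side are the ones that slow generation down, which is the speed tradeoff mentioned above.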
L_Acacia@lemmy.one 9 months ago
If you have good enough hardware, this is a rabbit hole you could explore: github.com/oobabooga/text-generation-webui/