Comment on Nvidia creates gaming-centric AI chatbot that runs on your GPU, locally.
WolfLink@sh.itjust.works 2 weeks ago
You can run your own LLM chatbot locally with Ollama (ollama.com).
They have some really small models that only need around 1GB of VRAM, but you’ll generally get better results if you pick the biggest model that fits in your GPU’s memory.
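For anyone curious, the basic workflow is just a couple of commands. A rough sketch (the install line is the documented Linux one-liner; the model tag is only an example — check ollama.com for current installers and available models):

```shell
# Install Ollama on Linux (macOS/Windows have installers on ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Download and chat with a small model; "llama3.2:1b" is an example tag
# in the ~1GB class — swap in whatever fits your VRAM
ollama run llama3.2:1b

# See which models you've pulled and how big they are
ollama list
```

`ollama run` downloads the model on first use and then drops you into an interactive chat in the terminal.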