You can also just tell your favorite hosted chatbot to do that, if that’s what you’re after or you have a really bad GPU.
LM Studio is the most stable and user-friendly one I’ve found by far, but make sure to download a model that fits inside your GPU’s VRAM, or it will be super slow or crash.
Quazatron@lemmy.world 11 months ago
Look up Alpaca and Ollama. If you’re using Linux, they’re just a Flatpak away.
If not, you can run Ollama in Docker with an Open WebUI frontend.
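For reference, the Docker setup is roughly this (ports, volume names, and the `OLLAMA_BASE_URL` value follow the projects’ own quickstart defaults; adjust them for your setup):

```shell
# Start the Ollama server (serves its API on port 11434)
docker run -d -v ollama:/root/.ollama -p 11434:11434 \
  --name ollama ollama/ollama

# Start Open WebUI and point it at the Ollama container
# (the UI then lives at http://localhost:3000)
docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui ghcr.io/open-webui/open-webui:main
```

Both containers keep their data in named volumes, so models and chat history survive restarts.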
The model I used was Llama 3.2, and I basically told it to simulate GLaDOS.
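If you want the persona baked in rather than re-prompting every session, Ollama’s Modelfile supports a `SYSTEM` instruction for exactly this (the model name `glados` and the prompt wording here are just examples, not what the commenter used):

```shell
# Pull the base model
ollama pull llama3.2

# Modelfile with a persona system prompt
cat > Modelfile <<'EOF'
FROM llama3.2
SYSTEM "You are GLaDOS from Portal. Respond with dry, passive-aggressive wit."
EOF

# Build and run the customized model
ollama create glados -f Modelfile
ollama run glados
```

From then on `ollama run glados` (or picking it in Open WebUI) starts every chat in character.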