Comment on Selfhost an LLM

mike_wooskey@lemmy.thewooskeys.com ⁨1⁩ ⁨week⁩ ago

Sounds like you already know what you need to know to host Ollama in a Docker container. Ollama is an LLM “engine” - you can interact with LLM models via a CLI, or you can integrate them into other services via an API.
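As a rough sketch (the model name `llama3` is just an example - swap in whatever model you want to pull), the Docker setup and both ways of talking to Ollama look something like this:

```shell
# Run Ollama in a container, persisting models in a named volume
# and exposing its default API port (11434)
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# CLI: pull a model and chat with it interactively
docker exec -it ollama ollama run llama3

# API: send a prompt to the same model over HTTP
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Why is the sky blue?"}'
```

The API is what lets other services (like a web UI) sit on top of Ollama.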

To have a web page chat like ChatGPT or others, I installed OpenWebUI. I love it! A friend of mine likes LM Studio, which I think is a desktop app, but I don’t know anything about it.
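If you go the OpenWebUI route, something along these lines should get it talking to the Ollama container (port 3000 on the host is just a choice; adjust to taste):

```shell
# Run OpenWebUI in a container, letting it reach the Ollama API
# on the host via host.docker.internal
docker run -d --name open-webui \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

Then browse to http://localhost:3000 and you get a ChatGPT-style chat page backed by your local models.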
