Comment on Selfhost an LLM

ikidd@lemmy.world 3 days ago

OpenWebUI is pretty much exactly what you’re looking for. It can start up an Ollama instance that you can use from your other applications over the network, and chat with it as you see fit. If you have an API key from an outside subscription like OpenRouter or Anthropic, you can enter it and use the models available there when the local ones you’ve downloaded aren’t up to the task.
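
As a minimal sketch of the "use it from your other applications over the network" part: once the Ollama instance is running, anything on your LAN can hit its HTTP API directly. The host/port and model name below are assumptions (default Ollama port, a model you'd have already pulled), so adjust to your setup:

```python
# Minimal sketch: call a network-reachable Ollama instance from another app.
# Assumptions: Ollama is listening on its default port (11434) and the model
# name ("llama3" here) is one you've already downloaded.
import requests

OLLAMA_URL = "http://localhost:11434"  # replace with your server's address

resp = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={
        "model": "llama3",                      # any locally pulled model
        "prompt": "Say hello in one sentence.",
        "stream": False,                        # single JSON reply, no streaming
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```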
