Comment on What can I use for an offline, self-hosted LLM client, pref. with images, charts, python code execution

andrew0@lemmy.dbzer0.com 2 days ago

Ollama for the API backend, which you can integrate into Open WebUI. You can also integrate image generation through ComfyUI, I believe.
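
For reference, a rough sketch of what talking to Ollama's local HTTP API looks like (this is roughly what Open WebUI does under the hood). The model name here is just an example; you'd use whatever you've pulled with `ollama pull`:

```python
# Minimal sketch: querying a local Ollama server's HTTP API directly.
# Assumes Ollama is running on its default port 11434 and that a model
# (here "llama3", an example name) has already been pulled.
import json
import urllib.request

def ask_ollama(prompt: str, model: str = "llama3") -> str:
    # /api/generate is Ollama's single-turn completion endpoint;
    # stream=False returns one JSON object instead of a stream of chunks.
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_ollama("Explain what a self-hosted LLM setup is in one sentence."))
```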

It’s less of a hassle to run Open WebUI through Docker, but Ollama itself works as a regular CLI tool.
