Comment on What can I use for an offline, selfhosted LLM client, pref with images, charts, python code execution
andrew0@lemmy.dbzer0.com 2 days ago
Ollama for the API, which you can integrate into Open WebUI. You can also integrate image generation with ComfyUI, I believe.
It’s less of a hassle to use Docker for Open WebUI, but Ollama works as a regular CLI tool.
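For reference, a minimal sketch of what "Ollama for the API" looks like from Python, assuming the daemon is on its default port (11434) and that the model name is a placeholder for one you have already pulled:

```python
# Minimal sketch: call the local Ollama daemon over its HTTP API.
# Assumes Ollama is running on the default port (11434) and that a
# model has been pulled ("ollama pull llama3" -- placeholder name).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",  # placeholder; use any model you have pulled
        "prompt": "Explain what a reverse proxy does, in one sentence.",
        "stream": False,    # one JSON object instead of a streamed response
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```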
O_R_I_O_N@lemm.ee 2 days ago
Chainlit is a super easy UI tool. Ollama works well with Semantic Kernel (for integration with existing code) and LangChain (for agent orchestration). I’m working on building MCP interaction with ComfyUI’s API; it’s a pain in the ass.
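To illustrate the LangChain route, a rough sketch assuming the langchain-ollama package and a locally pulled model (the model name is a placeholder):

```python
# Minimal sketch: drive a local Ollama model through LangChain.
# Assumes `pip install langchain-ollama` and a model pulled locally;
# "llama3" is a placeholder for whatever you actually have.
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3", base_url="http://localhost:11434")
reply = llm.invoke("What is the Model Context Protocol (MCP) for?")
print(reply.content)  # invoke() returns a message object; .content is the text
```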
muntedcrocodile@hilariouschaos.com 1 day ago
This is what I do; it’s excellent.
catty@lemmy.world 1 day ago
But won’t this be a mish-mash of different Docker containers and projects?
andrew0@lemmy.dbzer0.com 16 hours ago
All the ones I mentioned can be installed with pip or uv if I am not mistaken. It would probably be more finicky than containers that you can put behind a reverse proxy, but it is possible if you wish to go that route. Ollama will also run system-wide, so any project will be able to use its API without you having to create a separate environment and download the same model twice in order to use it.
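As a quick illustration of that shared, system-wide API, a sketch that lists whatever models the local Ollama daemon is already serving (assuming the default port):

```python
# Minimal sketch: because the Ollama daemon runs system-wide, any
# project can reuse the models it already serves. This just lists
# them, assuming the default port 11434.
import requests

tags = requests.get("http://localhost:11434/api/tags", timeout=10).json()
for model in tags.get("models", []):
    print(model["name"])
```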