Comment on: What can I use for an offline, self-hosted LLM client, pref. with images, charts, Python code execution

O_R_I_O_N@lemm.ee · 2 weeks ago

Chainlit is a super easy UI too. Ollama works well with Semantic Kernel (for integration with existing code) and LangChain (for agent orchestration). I’m working on building MCP interaction with ComfyUI’s API; it’s a pain in the ass.
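For anyone wanting to try Ollama before wiring up Chainlit or LangChain: it exposes a plain HTTP API, so the stdlib is enough to talk to it. A minimal sketch (the endpoint and port are Ollama's defaults; the model name is just an example you'd pull yourself):

```python
import json
import urllib.request

# Ollama's default local generate endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # Non-streaming request body for Ollama's /api/generate endpoint
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    # POST the JSON payload and return the model's full response text
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires a running `ollama serve` with the model pulled):
# print(generate("llama3", "Why is the sky blue?"))
```

Chainlit or LangChain basically wrap this same call, so it's a handy sanity check that your local server is up before debugging the higher layers.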
