Comment on What can I use for an offline, selfhosted LLM client, pref with images, charts, python code execution
andrew0@lemmy.dbzer0.com 6 days ago

All the ones I mentioned can be installed with pip or uv, if I am not mistaken. It would probably be more finicky than containers that you can put behind a reverse proxy, but it is possible if you wish to go that route. Ollama also runs system-wide, so any project can use its API without you having to create a separate environment and download the same model twice in order to use it.
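For instance, once Ollama is running as a system service, any local script or frontend can hit its HTTP API directly. A minimal sketch (the model name and the default localhost:11434 port are assumptions based on a standard install):

```python
import json
import urllib.request

# Ollama exposes a local HTTP API (default port 11434) when it runs system-wide.
# The model name is just an example; use whatever you've pulled with `ollama pull`.
payload = {
    "model": "llama3",
    "prompt": "Summarise why reverse proxies are useful, in one sentence.",
    "stream": False,  # request a single JSON response instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Every frontend you install with pip or uv can point at that same endpoint, so the model only lives on disk once.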