Comment on What can I use for an offline, selfhosted LLM client, pref with images,charts, python code execution
catty@lemmy.world 16 hours ago
But won’t this be a mish-mash of different docker containers and projects?
andrew0@lemmy.dbzer0.com 8 hours ago
All the ones I mentioned can be installed with pip or uv, if I am not mistaken. It would probably be more finicky than containers that you can put behind a reverse proxy, but it is possible if you wish to go that route. Ollama also runs system-wide, so any project can use its API without you having to create a separate environment or download the same model twice.
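For example, something like this is all it takes to hit the shared Ollama instance from any Python project (just a sketch: it assumes Ollama's default endpoint on localhost:11434 and a model you have already pulled, e.g. llama3):

```python
# Minimal sketch: talk to the system-wide Ollama API from any project.
# Assumes the default endpoint (localhost:11434) and an already-pulled model
# such as "llama3" -- swap in whichever model you actually use.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": "Summarise what a reverse proxy does.",
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

Since every tool talks to the same Ollama instance over HTTP, they all share one copy of the model weights instead of each keeping their own.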