Comment on What can I use for an offline, self-hosted LLM client, preferably with images, charts, and Python code execution

mitexleo@buddyverse.one · 1 day ago

You should try cherry-ai.com … It’s the most advanced client I’ve come across. I personally use Ollama for running the models locally and the Mistral API for more advanced tasks.
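If you go the Ollama route, talking to it from your own code is straightforward. Here’s a minimal Python sketch against Ollama’s local REST API, assuming the default port 11434 and that you’ve already pulled a model (the `mistral` tag here is just an example):

```python
# Minimal sketch: query a locally running Ollama server via its REST API.
# Assumes Ollama is listening on the default port 11434 and that the
# "mistral" model tag (an example; substitute any pulled model) exists.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "mistral",  # any locally pulled model tag
        "prompt": "Summarize what a self-hosted LLM client is in one sentence.",
        "stream": False,     # return the whole completion as a single JSON object
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

With `"stream": False` you get one JSON object back with the full completion in its `response` field; leave streaming on if you want tokens as they’re generated.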
