Comment on What can I use for an offline, self-hosted LLM client, preferably with images, charts, and Python code execution
mitexleo@buddyverse.one 2 weeks ago
You should try cherry-ai.com … it's one of the most advanced clients out there. I personally use Ollama for running models locally and the Mistral API for more advanced tasks.
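If you want to script against the same local setup, here's a rough sketch of how you can call Ollama's built-in REST API from Python. This assumes Ollama is running on its default port (11434) and that you've already pulled a model; the model name "llama3" here is just an example, swap in whatever you use:

```python
# Minimal sketch: query a locally running Ollama server over its REST API.
# Assumes `ollama serve` is running on the default port 11434 and a model
# (e.g. "llama3") has been pulled beforehand with `ollama pull llama3`.
import requests

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    """Send a single prompt to the local Ollama server and return its reply."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    # With stream=False, Ollama returns one JSON object with a "response" field.
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_local_model("Summarize what a self-hosted LLM client is in one sentence."))
```

Clients like the one above stay fully offline as long as the model runs through Ollama; the Mistral API only comes into play when you explicitly point requests at their hosted endpoint.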