Comment on: What can I use for an offline, self-hosted LLM client, preferably with images, charts, and Python code execution?

mitexleo@buddyverse.one 3 weeks ago
You should try cherry-ai.com … It’s the most advanced client out there. I personally use Ollama for running the models and the Mistral API for advanced tasks.

mitexleo@buddyverse.one 3 weeks ago
It’s fully open source and free (as in beer).
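For context on what "using Ollama for running the models" means in practice: desktop clients like the one recommended here typically talk to Ollama's local REST API, which a default install exposes on localhost:11434. A minimal sketch, assuming Ollama is running and a model has already been pulled (the model tag and prompt below are placeholders, not from the thread):

```python
import json
import urllib.request

# Minimal sketch, assuming a default Ollama install listening on
# localhost:11434 and a model already pulled (e.g. `ollama pull llama3`).
url = "http://localhost:11434/api/generate"
payload = {
    "model": "llama3",  # placeholder: any locally pulled model tag
    "prompt": "Summarize what a self-hosted LLM client does.",
    "stream": False,    # return one JSON object instead of a token stream
}
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    # With stream=False, the reply is a single JSON object whose
    # "response" field holds the generated text.
    print(json.loads(resp.read())["response"])
```

Since everything goes over localhost, this setup stays fully offline; a client pointed at a remote API (like Mistral's) only differs in the endpoint and credentials.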
catty@lemmy.world 3 weeks ago
But its website is in Chinese. Also, what’s the GitHub?
happinessattack@lemmy.world 3 weeks ago
github.com/CherryHQ/cherry-studio