Comment on "What can I use for an offline, self-hosted LLM client, preferably with images, charts, and Python code execution?"
mitexleo@buddyverse.one 1 day ago
You should try cherry-ai.com … It’s the most advanced client out there. I personally use Ollama for running the models and the Mistral API for advanced tasks.
mitexleo@buddyverse.one 1 day ago
It’s fully open source and free (as in beer).
catty@lemmy.world 1 day ago
But its website is in Chinese. Also, what’s the GitHub?
happinessattack@lemmy.world 17 hours ago
github.com/CherryHQ/cherry-studio