Comment on What can I use for an offline, self-hosted LLM client, pref. with images, charts, Python code execution
mitexleo@buddyverse.one · 1 month ago
LocalAI is pretty good but resource-intensive. I ran it on a VPS in the past.