Comment on: What can I use for an offline, self-hosted LLM client, preferably with images, charts, and Python code execution?

mitexleo@buddyverse.one ⁨2⁩ ⁨weeks⁩ ago

LocalAI is pretty good but resource-intensive. I ran it on a VPS in the past.
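For anyone trying it out: LocalAI exposes an OpenAI-compatible API, so querying a local instance looks roughly like this. A minimal sketch only; the port (8080), model name, and prompt are assumptions for a default local install, so adjust for your setup.

```python
import json
import urllib.request

# LocalAI serves an OpenAI-compatible endpoint; the host, port, and
# model name below are assumptions -- change them for your install.
BASE_URL = "http://localhost:8080/v1/chat/completions"

payload = {
    "model": "gpt-4",  # whichever model you've loaded into LocalAI
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.7,
}

req = urllib.request.Request(
    BASE_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment to send the request against a running LocalAI server:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])

print(json.dumps(payload, indent=2))
```

Because the API shape matches OpenAI's, most existing client libraries and front-ends can be pointed at a LocalAI base URL instead.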
