Comment on What can I use for an offline, self-hosted LLM client, preferably with images, charts, and Python code execution

hendrik@palaver.p3x.de 1 day ago

Maybe LocalAI? It doesn't do Python code execution, but it covers pretty much all of the rest.
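For what it's worth, LocalAI exposes an OpenAI-compatible HTTP API, so any standard client can talk to it once it's running. A minimal sketch in Python, assuming a LocalAI instance already listening on its default port 8080 (e.g. started from its Docker image) and a placeholder model name for whatever model you've actually installed:

```python
# Minimal sketch: query a locally running LocalAI server over its
# OpenAI-compatible chat completions endpoint.
# Assumptions: LocalAI is running on localhost:8080 (its default port),
# and "your-model-name" is replaced with a model you've loaded into it.
import requests

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "your-model-name",  # placeholder for an installed model
        "messages": [
            {"role": "user", "content": "Describe what a self-hosted LLM setup looks like."},
        ],
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```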
