Comment on: What can I use for an offline, self-hosted LLM client, preferably with images, charts, Python code execution

voidspace@lemmy.world 1 day ago

It took ages to produce an answer, only worked once on one model, and has crashed every time since.
