Comment on What can I use for an offline, selfhosted LLM client, pref with images,charts, python code execution
voidspace@lemmy.world 2 months agoTook ages to produce an answer, and it only worked once on one model; it has crashed every time since.
catty@lemmy.world 2 months ago
Try the beta on the GitHub repo.