Comment on What can I use for an offline, selfhosted LLM client, pref with images,charts, python code execution
voidspace@lemmy.world 1 month agoTook ages to produce an answer, and it only worked once on one model; it has crashed every time since.
catty@lemmy.world 1 month ago
Try the beta on the GitHub repo.