Comment on What can I use for an offline, selfhosted LLM client, pref with images, charts, python code execution

catty@lemmy.world 1 day ago
This looks interesting - do you have experience of it? How reliable / efficient is it?

hendrik@palaver.p3x.de 1 day ago
I think many people use it and it works. But sorry - no, I don't have any first-hand experience. I've tested it for a bit and it looked fine. It has a lot of features and should be about as efficient as any other ggml/llama.cpp-based inference solution, at least for text. I myself use KoboldCPP for the few things I do with AI, and since my computer lacks a GPU I don't generate many images with software like this. Your generation times are likely to be well under the 15 minutes it takes me to produce an image on my ill-suited machine.

mitexleo@buddyverse.one 23 hours ago
LocalAI is pretty good but resource-intensive. I ran it on a VPS in the past.
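For anyone who gets an instance running and wants to poke at it: LocalAI serves an OpenAI-compatible REST API, so a minimal Python sketch like the one below can talk to it. This assumes a default install listening on localhost:8080; the model name here is a placeholder, so substitute whichever model your instance actually serves (you can list them at /v1/models).

```python
import requests

# Assumes LocalAI is running on its default port 8080.
# The model name below is hypothetical; use one your instance has configured.
resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "llama-3.2-1b-instruct",  # placeholder; check /v1/models
        "messages": [
            {"role": "user", "content": "Hello, are you running fully offline?"}
        ],
    },
    timeout=120,  # local CPU inference can be slow
)
resp.raise_for_status()
# Standard OpenAI-style response shape: pull the assistant's reply text.
print(resp.json()["choices"][0]["message"]["content"])
```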