It took ages to produce an answer, and only worked once on one model; it's crashed every time since then.
Comment on: What can I use for an offline, self-hosted LLM client, pref. with images, charts, Python code execution
catty@lemmy.world 1 day ago
I’ve discovered jan.ai which is far faster than GPT4All, and visually a little nicer
otacon239@lemmy.world 1 day ago
I also started using this recently and it's very plug and play. Just open and run. It's the only client so far that I feel I could recommend to non-geeks.
catty@lemmy.world 1 day ago
I agree. It looks nice, explains the models fairly well, hides the model settings away nicely, and even recommends some low-requirement models to get started with. I like the concept of plugins, but I haven't yet found a way to, e.g., run Python code it creates and display the output in the window.
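For anyone curious what such a "run the generated code" plugin would involve, here's a minimal sketch of the idea, nothing to do with jan.ai's actual plugin API (which I haven't dug into): write the model's code to a temp file, run it in a separate interpreter process so a crash or hang can't take the client down, and capture stdout to display in the chat window.

```python
import subprocess
import sys
import tempfile

# Stand-in for code the model generated; a real client would take
# this from the assistant's reply.
generated_code = "print(sum(range(10)))"

# Write the code to a temporary .py file.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(generated_code)
    path = f.name

# Run it in a fresh interpreter process with a timeout, capturing
# stdout/stderr instead of letting it print into the client's console.
result = subprocess.run(
    [sys.executable, path],
    capture_output=True,
    text=True,
    timeout=10,
)

# This is the text a client could render in the chat window.
print(result.stdout.strip())
```

Note this only isolates crashes, not malicious code; a serious plugin would want a real sandbox (container, seccomp, etc.) before executing arbitrary model output.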