melfie@lemy.lol 4 days ago
I have no interest in using it, but at least it’s MIT licensed, which puts it ahead of Microslop’s rubbish if nothing else.
elvith@feddit.org 4 days ago
Yeah, but if I understand that correctly, that’s just for the app itself; the LLM is very likely still a proprietary one (ChatGPT, Grok, …)
melfie@lemy.lol 4 days ago
Looks like it supports locally hosted models as well, such as via Ollama: docs.openclaw.ai/providers.
Jrockwar@feddit.uk 4 days ago
The LLM is whatever you want it to be: self-hosted, or from any provider with a compatible endpoint. It’s likely to be a proprietary one in practice, though, because the cost of training LLMs means most models are proprietary.
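To illustrate the “compatible endpoint” point: providers that expose an OpenAI-style chat-completions API all accept the same request shape, so swapping a hosted model for a self-hosted one (e.g. via Ollama, which serves an OpenAI-compatible API at `/v1` on its default port 11434) mostly means changing the base URL and model name. This is a minimal sketch of that idea, not OpenClaw’s actual configuration; the model names are placeholders.

```python
import json

def chat_request(base_url: str, model: str, prompt: str) -> tuple[str, dict]:
    """Build the URL and JSON body for an OpenAI-compatible chat call."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, body

# Hosted, proprietary provider:
hosted_url, hosted_body = chat_request("https://api.openai.com/v1", "gpt-4o-mini", "hi")

# Self-hosted via Ollama's OpenAI-compatible endpoint (default port 11434);
# "llama3" is just an example of a locally pulled model:
local_url, local_body = chat_request("http://localhost:11434/v1", "llama3", "hi")

# Same payload shape either way; only the URL and model name differ.
print(hosted_url)   # https://api.openai.com/v1/chat/completions
print(local_url)    # http://localhost:11434/v1/chat/completions
print(json.dumps(local_body))
```

The request bodies are structurally identical, which is why an app can stay agnostic about whether the model behind the endpoint is proprietary or self-hosted.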