The LLM is whatever you want it to be: self-hosted, or from any provider with a compatible endpoint. In practice it's likely a proprietary one, since the cost of training LLMs means most of them are.
Yeah, but if I understand that correctly, that's just for the app itself; the LLM is very likely still a proprietary one (ChatGPT, Grok, …).
Jrockwar@feddit.uk 4 days ago
melfie@lemy.lol 4 days ago
Looks like it supports locally hosted models as well, such as via Ollama: docs.openclaw.ai/providers.
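For what it's worth, Ollama exposes an OpenAI-compatible API at `http://localhost:11434/v1`, so any client that speaks that protocol can target a local model instead of a proprietary provider. A minimal sketch (the model name `llama3.2` is illustrative; it assumes you've pulled a model and Ollama is running on its default port):

```python
import json
import urllib.request

# Ollama's default OpenAI-compatible chat endpoint (assumption: default port).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at a local model."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Sending this with urllib.request.urlopen(req) would return a standard
# OpenAI-format chat completion response from the local model.
req = build_request("llama3.2", "Hello")
print(req.full_url)
```

Because the endpoint is protocol-compatible, the same app code works whether the URL points at a local Ollama instance or a hosted provider.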