Comment on Intent recognition for HomeAssistant without an LLM?
smiletolerantly@awful.systems 4 days ago
Thanks, had not heard of this before! From skimming the link, it seems that the integration with HASS mostly focuses on providing Wyoming endpoints (STT, TTS, wakeword), right? (Un)fortunately, that’s the part that’s already working really well 😄
However, the idea of just writing a stand-alone application with Ollama-compatible endpoints, but not actually putting an LLM behind it, is genius; I had not thought about that. That could really simplify stuff if I decide to write a custom intent handler. So, yeah, thanks for the link!!
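Sketching it out for myself, something like this could work: a fake “Ollama” that just does rule-based intent matching instead of calling a model. This is only a rough sketch, assuming the client (e.g. the HASS Ollama integration) talks to /api/chat and accepts a non-streaming JSON reply; the RULES table and handle_intent helper below are placeholders I made up, and streaming plus the other Ollama endpoints are left out.

```python
# Minimal sketch: an Ollama-shaped HTTP API with no LLM behind it.
# Assumes the client uses /api/chat and accepts a non-streaming reply;
# verify against whatever client you actually point at this.
from datetime import datetime, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical rule table: phrase fragment -> canned reply (or callable).
RULES = {
    "turn on the living room light": "OK, turning on the living room light.",
    "what time is it": lambda: datetime.now().strftime("It is %H:%M."),
}


def handle_intent(text: str) -> str:
    """Very naive intent matcher; replace with real intent handling."""
    lowered = text.lower()
    for phrase, reply in RULES.items():
        if phrase in lowered:
            return reply() if callable(reply) else reply
    return "Sorry, I did not understand that."


@app.post("/api/chat")
def chat():
    body = request.get_json(force=True)
    # Pull the most recent user message out of the chat history.
    last_user_msg = next(
        (m["content"] for m in reversed(body.get("messages", []))
         if m.get("role") == "user"),
        "",
    )
    # Non-streaming, Ollama-style response envelope.
    return jsonify({
        "model": body.get("model", "fake-intent-handler"),
        "created_at": datetime.now(timezone.utc).isoformat(),
        "message": {"role": "assistant", "content": handle_intent(last_user_msg)},
        "done": True,
    })


@app.get("/api/tags")
def tags():
    # Some clients list models first; advertise a single fake "model".
    return jsonify({"models": [{"name": "fake-intent-handler:latest"}]})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=11434)  # Ollama's default port
```

Point the Ollama integration at this host/port and pick the fake model, and every “chat” request would go through handle_intent instead of an actual LLM.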
Canuck@sh.itjust.works 4 days ago
You can connect Ollama and cloud providers like ChatGPT to OVOS/Neon, so that when you ask a question it doesn’t know how to handle, it can respond using the LLM.
smiletolerantly@awful.systems 4 days ago
Please read the title of the post again. I do not want to use an LLM. A self-hosted one is bad enough, but feeding my data to OpenAI is worse.