Comment on Intent recognition for HomeAssistant without an LLM?
JoeyJoeJoeJr@lemmy.ml 5 days ago
I don’t have as much experience with HASS, but I did use Mycroft for quite a while (stopped only because I had multiple big moves, and ended up in a place small enough that voice control didn’t really make sense any more). There were a few intent parsers used with/made for it:
github.com/MycroftAI/adapt
github.com/MycroftAI/padatious
github.com/MycroftAI/padaos
In my experience, Adapt was far and away the most reliable. If you go the route of rolling your own solution, I’d recommend checking that out, and using the absolute minimum number of words to design your intents. E.g. require “off” and an entity, and nothing else, so that “AC off,” “turn off the AC,” and “turn the AC off” all work. This reduces the number of words your STT has to transcribe correctly, and allows flexibility in command phrasing.
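To illustrate the minimal-keyword idea without pulling in Adapt itself (whose real API goes through `IntentBuilder` and `IntentDeterminationEngine`), here's a library-free Python sketch with hypothetical entity and intent names: an intent fires when the utterance contains one required keyword plus one known entity, in any order, so filler words never matter.

```python
# Minimal sketch of keyword + entity intent matching (illustrative only;
# Adapt's actual API differs). Entity and intent names are made up.
ENTITIES = {"ac", "lights", "heater"}
INTENTS = {
    "TurnOff": {"off"},
    "TurnOn": {"on"},
}

def match_intent(utterance):
    """Return (intent_name, entity) if a keyword and an entity both appear."""
    words = set(utterance.lower().split())
    entity = next((w for w in words if w in ENTITIES), None)
    if entity is None:
        return None
    for name, keywords in INTENTS.items():
        if words & keywords:
            return (name, entity)
    return None

# Word order doesn't matter, so all three phrasings resolve identically:
for phrase in ["AC off", "turn off the AC", "turn the AC off"]:
    print(match_intent(phrase))  # ('TurnOff', 'ac') each time
```

Because only "off" and the entity are required, the STT engine only has to get those two words right for the command to work.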
If you borrow a little more from Mycroft, they had “fallback” skills that were triggered when an intent couldn’t be matched. You could use the same idea, and use github.com/seatgeek/thefuzz to fuzzy match entities and keywords, to try to handle remaining cases where STT fails. I believe that is what this community-made skill attempted to do: github.com/MycroftAI/skill-homeassistant (I think there was more than one HASS skill implementation, so I could be conflating this with another).
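As a sketch of that fallback idea, here's a version using the standard library's difflib as a stand-in for thefuzz (thefuzz's `process.extractOne` does roughly the same job with Levenshtein-based scoring). The entity names are hypothetical; the point is recovering a garbled STT transcription by fuzzy-matching it against the known entity list:

```python
from difflib import get_close_matches

# Hypothetical HASS entity names to match against.
ENTITIES = ["air conditioner", "living room lights", "bedroom heater"]

def fallback_entity(stt_output, cutoff=0.6):
    """Fuzzy-match a possibly garbled transcription to a known entity.

    Stand-in for thefuzz.process.extractOne(); difflib scores with a
    SequenceMatcher ratio rather than Levenshtein distance. Returns
    None when nothing clears the cutoff, so the caller can give up
    gracefully instead of acting on a bad guess.
    """
    matches = get_close_matches(stt_output.lower(), ENTITIES, n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(fallback_entity("bedroom heat her"))  # -> "bedroom heater"
print(fallback_entity("xyzzy"))            # -> None
```

The cutoff is the knob to tune: too low and mis-transcriptions trigger the wrong device, too high and the fallback never fires.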
Another comment mentioned OVOS/Neon - those forked off of Mycroft, so you may see overlap if you investigate those as well.
smiletolerantly@awful.systems 3 days ago
Thanks for the recommendation! That looks interesting indeed.
This entire topic is probably a sinkhole of complexity. It’s great to have somewhere to look for inspiration!