Comment on Taco Bell rethinks AI drive-through after man orders 18,000 waters
ch00f@lemmy.world 2 weeks ago
So the output from the LLM is just a text description that’s fed into another, smarter piece of software that interprets that text into an order? What task is the LLM actually doing in this case?
Vanth@reddthat.com 2 weeks ago
I don’t think there is an LLM in this application. Not all AI tools involve an LLM.
Dashi@lemmy.world 2 weeks ago
The LLM is taking the order, interpreting what people say into that simple text description. Not everyone talks the same or describes things the same way; that, I believe, is where the LLM is doing the bulk of the work. Then I’m sure there is some background stock management and health checking it handles as well.
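A minimal sketch of the pipeline described in this thread, assuming the LLM only turns free-form speech into a simple text/JSON order description and ordinary code does the actual ordering. All names, the menu, and the quantity limit here are invented for illustration, not Taco Bell’s actual system.

```python
import json

# Hypothetical menu and sanity limit; a plain check like this would have
# rejected the 18,000 waters long before they reached the order system.
MENU = {"crunchy taco": 1.79, "bean burrito": 1.99, "water": 0.00}
MAX_QTY = 20

def validate_order(llm_output: str) -> list[dict]:
    """Turn the LLM's JSON-ish order description into checked line items."""
    items = json.loads(llm_output)  # e.g. '[{"item": "water", "qty": 18000}]'
    order = []
    for entry in items:
        name = entry["item"].lower()
        qty = int(entry["qty"])
        if name not in MENU:
            raise ValueError(f"unknown menu item: {name}")
        if not 1 <= qty <= MAX_QTY:
            raise ValueError(f"implausible quantity for {name}: {qty}")
        order.append({"item": name, "qty": qty, "price": MENU[name] * qty})
    return order

try:
    validate_order('[{"item": "water", "qty": 18000}]')
except ValueError as err:
    print(err)  # implausible quantity for water: 18000
```

The point of the split is that the language model only handles the messy speech-to-description step; everything after that is deterministic code that can enforce menu and quantity rules.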
TheBat@lemmy.world 2 weeks ago
What’s wrong with an input machine with buttons or touch screen?
deegeese@sopuli.xyz 2 weeks ago
Takes too long to hold down the button for 18,000 waters.
spankmonkey@lemmy.world 2 weeks ago
Not futuristic enough or something.
Dashi@lemmy.world 2 weeks ago
They aren’t able to answer questions or be changed simply via a software update.
parody@lemmings.world 2 weeks ago
OT4G
(Order Time For Grandma)
Tollana1234567@lemmy.today 2 weeks ago
Not as hype-able to the C-suites and CEOs.
Serinus@lemmy.world 2 weeks ago
We have apps for that, and they’re typically a PITA. They certainly take longer than just talking through your order.
pirat@lemmy.world 2 weeks ago
Yeah, unlike a human, who understands a customer saying “one pizzaburger, that’s all”, the app doesn’t understand that the order is complete; it just keeps asking obviously unwanted, cringey questions like “buy two and save a few cents on the second one?” or “what will you drink with that?” or “is that a big menu?”…