Comment on We have to stop ignoring AI’s hallucination problem

14th_cylon@lemm.ee 5 months ago
People who suggest, let’s say, firing the employees of a crisis intervention hotline and replacing them with LLMs…

Voroxpete@sh.itjust.works 5 months ago
Less horrifying conceptually, but in Canada a major airline tried to replace its support services with a chatbot. The chatbot then invented discounts that didn’t actually exist, and the courts ruled that the airline had to honour them. The chatbot was, for all intents and purposes, no more or less official a source of information than anything else the airline publishes, such as its website and other documentation.
L_Acacia@lemmy.one 5 months ago
They know the tech is not good enough; they just don’t care and want to maximise profit.
SkyezOpen@lemmy.world 5 months ago
“Have you considered doing a flip as you leap off the building? That way your death is super memorable and cool, even if your life wasn’t.”
-Crisis hotline LLM, probably.