Comment on [deleted]
mushroommunk@lemmy.today · 2 weeks ago
Guardrails only go so far when the underlying system is inherently flawed. AI is inherently flawed. A hallucination itself could kill someone vulnerable, and you can’t prompt your way out of that.