Comment on [deleted]

mushroommunk@lemmy.today 2 weeks ago

Guardrails only go so far when the underlying system is inherently flawed. AI is inherently flawed. A hallucination itself could kill someone vulnerable and you can’t prompt your way out of that.
