Comment on Researchers have found the cause of hallucinations in LLMs, H-Neurons: On the Existence, Impact, and Origin of Hallucination-Associated Neurons in LLMs

snooggums@piefed.world 1 day ago

Prioritizing conversational compliance over factual integrity when the output is promoted as being factual is a design flaw.

Saying "double-check the output" does not excuse that flaw when LLM CEOs say their models are like someone with a PhD, or that they can automate every white-collar job within a year.
