Comment on "Researchers have found the cause of hallucinations in LLMs" (H-Neurons: On the Existence, Impact, and Origin of Hallucination-Associated Neurons in LLMs)

snooggums@piefed.world 1 day ago

When the system is intended to look like a random person, then randomness is fine.

When the output is expected to be accurate, it should be the same each time so it can be verified as accurate.

LLMs are being sold as doing both at the same time, but random plus consistent equals random.
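The random-vs-consistent tension comes down to how a token is decoded. A minimal sketch (not from the paper, just standard temperature sampling): at temperature 0 the decoder always takes the argmax, so the same input gives the same output every run; at any temperature above 0 it samples, so outputs vary.

```python
import math
import random

def sample_token(logits, temperature, rng):
    """Pick a token index from raw logits.

    temperature == 0 -> greedy argmax: the same input always yields
    the same token, so output is reproducible (verifiable).
    temperature > 0  -> weighted random sampling: the same input can
    yield a different token on each call.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    # Softmax over temperature-scaled logits (shifted for stability).
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    weights = [math.exp(s - m) for s in scaled]
    return rng.choices(range(len(logits)), weights=weights, k=1)[0]

# Toy next-token logits for three candidate tokens.
logits = [2.0, 1.5, 0.5]

# Greedy decoding: every run agrees, regardless of the RNG seed.
greedy = {sample_token(logits, 0, random.Random(i)) for i in range(50)}

# Sampled decoding: runs disagree with each other.
sampled = {sample_token(logits, 1.0, random.Random(i)) for i in range(50)}

print(greedy)       # a single token index
print(len(sampled)) # more than one distinct token index
```

Even with temperature 0 in practice, batching and floating-point nondeterminism can still vary outputs, but the sketch shows the basic trade-off: once any sampling randomness is in the pipeline, the end-to-end output is no longer consistent.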
