Comment on We have to stop ignoring AI’s hallucination problem

Cyberflunk@lemmy.world 5 months ago

Hallucination, like depression, is a multifaceted issue, and training data is only one piece of it. Quantized or overfitted models lean on memorization at the expense of otherwise correct training data, and poorly structured inference can confuse a model too.
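
To make the quantization point concrete, here's a minimal sketch (not any real framework's implementation) of symmetric int8 round-trip quantization on a toy weight tensor; all names and sizes are illustrative. It shows the precision that gets thrown away every time weights are squeezed into fewer bits:

```python
# Toy illustration: quantize float32 weights to int8 and back,
# then measure the reconstruction error. Assumes symmetric
# per-tensor quantization; values are made up for the demo.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(0, 0.02, size=10_000).astype(np.float32)  # fake weight tensor

scale = np.abs(w).max() / 127.0                     # symmetric int8 scale
q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
w_hat = q.astype(np.float32) * scale                # dequantized weights

print("max abs error: ", np.abs(w - w_hat).max())
print("mean abs error:", np.abs(w - w_hat).mean())
```

Each weight can only land on 255 grid points after this, so the error above is baked into every forward pass, independent of how good the training data was.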

Rest assured, this isn’t just a training-data problem.
