Comment on We have to stop ignoring AI’s hallucination problem
Cyberflunk@lemmy.world 5 months ago

Hallucinations, like depression, are a multifaceted issue, and training data is only one piece of it. Quantized and overfitted models rely on memorization at the expense of otherwise correct training data, and poorly structured inference can also confuse a model.

Rest assured, this isn't just a training data problem.
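As a toy sketch of one of those factors (my own illustration, not something from the comment): symmetric int8 quantization rounds every weight to one of 256 levels, and the per-weight rounding error it introduces can compound across layers and nudge a model toward different, sometimes wrong, outputs.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric int8 quantization: map the largest |weight| to +/-127."""
    scale = np.max(np.abs(w)) / 127.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 codes."""
    return q.astype(np.float32) * scale

# Hypothetical weight values, just for demonstration.
weights = np.array([0.013, -0.842, 0.377, 0.0051], dtype=np.float32)
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
error = np.abs(weights - restored)
print(error)  # small but nonzero rounding error on most weights
```

The error per weight is bounded by half the quantization step, but nothing bounds how those perturbations accumulate through dozens of transformer layers, which is why aggressive quantization can degrade factuality even though the training data never changed.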
KillingTimeItself@lemmy.dbzer0.com 5 months ago
yeah, there’s this stuff as well, though i consider that to be more of a technical challenge than a hard limit.