Comment on We have to stop ignoring AI’s hallucination problem

dustyData@lemmy.world, 5 months ago

Not really. Reality is mostly a social construction. If there is no other to check against and bring about meaning, there is no reality, and therefore no hallucinations; more precisely, everything is a hallucination, since we cannot cross-reference reality with an LLM and it cannot correct itself to conform to our reality.

I’m not conflating tokens with anything; I explicitly said they aren’t an internal representation. They’re state and nothing else. LLMs don’t have an internal representation of reality, and they probably can’t, given their current way of working.
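To make the “state and nothing else” point concrete, here is a toy sketch (hypothetical code, not any real model’s API): an autoregressive loop whose entire state is the token sequence itself. Nothing in the loop can consult the world to check a claim, which is the structural reason the output can’t be corrected against reality from inside.

```python
# Minimal sketch of an autoregressive loop. The model's only "state"
# is the token sequence; there is no channel to reality anywhere.
import random

def next_token(tokens: list[str]) -> str:
    """Stand-in for a trained LLM: maps the current token state to a
    next token. A real model uses learned weights, but the shape is
    the same -- tokens in, token out, no grounding step."""
    vocab = ["the", "sky", "is", "green", "blue", "."]
    random.seed(hash(tuple(tokens)))  # deterministic given the state
    return random.choice(vocab)

state = ["the", "sky", "is"]          # the entire "internal" state
for _ in range(3):
    state.append(next_token(state))   # state -> token -> new state

print(" ".join(state))  # may happily assert "the sky is green"; the
                        # loop has no mechanism to correct it
```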
