Comment on We have to stop ignoring AI’s hallucination problem

UnpluggedFridge@lemmy.world 5 months ago

How do hallucinations preclude an internal representation? Couldn’t hallucinations arise from a consistent internal representation that is not fully aligned with reality?

I think you are misunderstanding the role of tokens in LLMs and conflating them with the internal representation. Tokens are used to generate a state, much as external stimuli are for a mind. The internal representation, assuming there is one, is the manner in which the tokens are processed. You could say the same thing about human minds: the representation is not located anywhere like a piece of data; it is the manner in which we process stimuli.
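To make that token/state distinction concrete, here is a minimal sketch using the Hugging Face transformers library and GPT-2 (neither of which the comment refers to specifically; they are just convenient stand-ins): the tokens are discrete input IDs, while the "state" they give rise to is the stack of hidden activations produced as the model processes them.

```python
# Sketch only: tokens vs. the state generated by processing them.
from transformers import AutoTokenizer, AutoModel
import torch

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModel.from_pretrained("gpt2")

# Tokens: a handful of discrete integer IDs, analogous to external stimuli.
inputs = tokenizer("The sky is", return_tensors="pt")
print(inputs["input_ids"])

# The "state": per-layer hidden activations produced while processing those IDs.
with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

# One tensor per layer, shape (batch, sequence_length, hidden_size).
for i, h in enumerate(outputs.hidden_states):
    print(f"layer {i}: {tuple(h.shape)}")
```

The point of the sketch is only that the hidden activations are nowhere stored as a retrievable "piece of data" about the world; they exist only as a consequence of how the model processes its inputs, which is the analogy being drawn to stimuli and minds.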
