Comment on We have to stop ignoring AI’s hallucination problem

UnpluggedFridge@lemmy.world 1 month ago

We do not know how LLMs operate. As with our own minds, we understand some primitives, but we have no idea how certain phenomena emerge from those primitives. Your assertion would be like saying we understand consciousness because we know the structure of a neuron.
