Comment on Mind-reading AI can translate brainwaves into written text: Using only a sensor-filled helmet combined with artificial intelligence, a team of scientists has announced they can turn a person’s thou...

Not_mikey@lemmy.world 1 year ago

Autocomplete is not a lossy encoding of a database either; it's a product of a dataset, just as you are a product of your experiences, but it is not wholly representative of that dataset.

A wind tunnel is not intelligent, because it doesn't answer questions or process knowledge/data; it just produces data. A wind tunnel will not answer the question "is this aerodynamic?", but you can observe the wind tunnel and use your intelligence to process that observation and answer the question.

Temperature and randomness don't explain hallucinations; hallucinations are a product of inference. If you turned the temperature down to 0 and asked "what happened in the great Christmas fire of 1934?", it would still give its best guess at what happened, even though that event is not in its dataset and it can't look up the answer. A temperature of 0 would just mean that between runs it consistently gives the same story, the one that is most statistically probable, rather than a less probable one that randomness happened to push up. Hallucinations are a product of inference, of taking something at face value and then trying to explain it. People do this too: if you confidently tell someone a lie and then ask them about it, they will use their intelligence to rationalize a story about what happened.
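To make the temperature point concrete, here is a minimal sketch of temperature-scaled sampling (the function name and logit values are made up for illustration). At temperature 0 the model always emits its single most probable continuation, so every run tells the same story; raising the temperature only lets less probable continuations get picked more often. Neither setting gives it any way to check whether the event it's describing ever happened.

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, rng=None):
    """Pick the next token from raw model scores (logits)."""
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=float)
    if temperature == 0:
        # Greedy decoding: always the most probable token,
        # so repeated runs give the exact same output.
        return int(np.argmax(logits))
    # Higher temperature flattens the distribution, letting
    # less probable tokens win more often.
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())   # numerically stable softmax
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

# Hypothetical scores over four candidate tokens; token 2 is the "best guess"
logits = [1.0, 2.0, 3.5, 0.5]
print(sample_next_token(logits, temperature=0))    # always 2
print(sample_next_token(logits, temperature=1.5))  # usually 2, sometimes others
```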
