Comment on Hype-fueling science fiction or plausible scenarios?

taladar@sh.itjust.works 4 days ago

I am not talking about degrees of intelligence at all. Measuring LLMs in IQ makes no sense because they literally have no model of the world; all they do is reproduce language by statistically modelling how the same words appeared together in their training data. That is the reason for all those inconsistencies in their output: they literally have no understanding of what they are saying.

An LLM can't tell, for example, that it is inconsistent to describe someone losing their right arm in one paragraph and then have them perform an activity that requires both hands in the next.
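The "statistics over word sequences" point can be made concrete with a toy bigram sampler. This is my own sketch, a drastic simplification of a real LLM, but it shows the core idea: the program only knows which word followed which in its training text, and it will happily emit fluent-looking sequences with no notion of what any word means.

```python
import random
from collections import defaultdict

# Tiny "training corpus" (hypothetical, just for illustration).
corpus = "the cat sat on the mat the cat ate the fish".split()

# Record which words were observed following each word.
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start, length, seed=0):
    """Emit words by repeatedly sampling an observed successor.

    There is no model of cats, mats, or fish here -- only counts of
    which word came after which in the input data.
    """
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        candidates = follows.get(out[-1])
        if not candidates:
            break
        out.append(rng.choice(candidates))
    return " ".join(out)

print(generate("the", 6))
```

Every pair of adjacent words in the output occurred somewhere in the corpus, so the text looks locally plausible, yet nothing constrains it to be globally consistent. Real LLMs use far richer statistics over much longer contexts, which is why they are fluent over longer spans, but the failure mode described above is the same kind of thing.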
