Comment on I'm looking for an article showing that LLMs don't know how they work internally
theunknownmuncher@lemmy.world 6 days ago
"it's completing the next word."
Facts disagree, but you’ve decided to live in a reality that matches your biases despite real evidence, so whatever 👍
glizzyguzzler@lemmy.blahaj.zone 6 days ago
It’s literally tokens. Doesn’t matter if it completes the next word or the next phrase; it’s still completing the next most likely token 😎😎 Can’t think, can’t reason, can only witch’s-brew up a facsimile of something done before
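For what "completing the next most likely token" mechanically means, here is a toy sketch of greedy decoding: the model scores every token in its vocabulary, and the highest-scoring one is emitted. The vocabulary and the scores below are invented for illustration; a real LLM produces logits over ~100k tokens from a neural network, not a hand-written list.

```python
import math

# Made-up toy vocabulary; real tokenizers use subword pieces, not whole words.
vocab = ["the", "cat", "sat", "mat", "."]

def softmax(logits):
    """Convert raw scores into a probability distribution."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def next_token(logits, vocab):
    """Greedy decoding: return the single highest-probability token."""
    probs = softmax(logits)
    best = max(range(len(vocab)), key=lambda i: probs[i])
    return vocab[best], probs[best]

# Pretend the model scored each vocabulary entry given the context "the cat":
logits = [0.1, 0.2, 3.5, 0.7, 0.3]
token, p = next_token(logits, vocab)
print(token)  # "sat" has the highest score, so greedy decoding picks it
```

In practice deployed models usually sample from the distribution (temperature, top-p) rather than always taking the argmax, but the "rank every token, pick from the top" loop is the same.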
Epp2@lemmynsfw.com 5 days ago
Why aren’t they tokens when you use them? Doesn’t your brain also choose the most apt selection for the sequence, to make maximal meaning in the context prompted? I assert that, past a sufficiently complex obfuscation of the underlying mathematical calculations, the concept of reasoning becomes an exercise in pedantic dissection of the mutual interpretation of meaning. Our own minds are objectively deterministic, but the obfuscation provided by our lack of direct observation gives the quantum cover fire needed to claim we are not just LLM-equivalent representations on biological circuit boards.