Comment on I'm looking for an article showing that LLMs don't know how they work internally

theunknownmuncher@lemmy.world 4 days ago

It’s true that LLMs aren’t “aware” of the internal steps they take, so asking an LLM how it reasoned out an answer just produces text that sounds statistically plausible given its training set. But to say something like “they can never reason” is provably false.

It’s obvious that you have a bias and desperately want reality to confirm it, but there’s been significant research and progress in tracing the internals of LLMs, which shows logic, planning, and reasoning. Neural networks are very powerful; after all, you are one too. Can you reason?
