Comment on: I'm looking for an article showing that LLMs don't know how they work internally

just_another_person@lemmy.world 4 days ago

It’s a developer option that isn’t generally available on consumer-facing products. It’s literally just a debug log that outputs the steps to arrive at a response, nothing more.

It’s not novel ideation or reasoning (programmatic neural networks don’t do that); it’s just an output of statistical data that says “Step 1 was 90% certain, Step 2 was 89% certain”, etc.
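A minimal sketch of the idea, assuming the “debug log” boils down to reading per-token probabilities out of the model: every token a causal language model emits has a softmax probability attached, which such a view could print as a per-step certainty. The model name, prompt, and greedy decoding below are illustrative choices, not a description of any particular product’s feature.

```python
# Sketch: log each generation step's probability as a "certainty" figure.
# Assumes the Hugging Face transformers library; "gpt2" is just a small
# placeholder model, not the system the parent comment refers to.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

prompt = "The capital of France is"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    for step in range(5):
        logits = model(input_ids).logits[0, -1]   # scores for the next token
        probs = torch.softmax(logits, dim=-1)     # convert scores to probabilities
        token_id = int(torch.argmax(probs))       # greedy pick of the most likely token
        print(f"Step {step + 1}: {tokenizer.decode(token_id)!r} "
              f"was {probs[token_id].item():.0%} certain")
        input_ids = torch.cat(
            [input_ids, torch.tensor([[token_id]])], dim=-1
        )
```

Each line of that output reads like the “Step 1 was 90% certain” log described above: a probability per step, nothing resembling introspection.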
