REDACTED@infosec.pub 9 hours ago

Ehh, you obviously only understand LLMs at a very basic level, with knowledge from 2021. This is like explaining jet engines as "air goes through, plane moves forward." Technically correct, but criminally oversimplified. They can very much decide to lie during the reasoning phase.

In OP's image, you can clearly see it decided to make shit up because it reasons that's what the human wants to hear. That's actually quite a rare example; I believe most models would default to "I'm an LLM, I don't have dark secrets."
