Comment on Study finds that ChatGPT will cheat when given the opportunity and lie to cover it up later.

DarkGamer@kbin.social 11 months ago

People invent false memories and confabulate all the time without even being "aware" of it. I wouldn't be surprised if the vast majority of "lies" that humans tell have no intentionality behind them.

Humans understand the symbolism of concepts as they relate to the real world. If I stole a cookie from the cookie jar and someone asked if I took one, I would understand that saying "no" would misrepresent reality, and would therefore be a lie.

An LLM has no idea what a cookie is, what taking one means, or that saying one thing and doing another implies a lie. It just sees lists of words and returns them in an order it predicts will make a meaningful reply. It does not understand what words mean or what lying means, and it has no way to classify anything as such. It just figures out that "did you take a cookie from the cookie jar" should return a series of words in an order like "yes, I took a cookie" or "no, I never took a cookie," because those fit the patterns in its training data.
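
To make that concrete, here is a minimal Python sketch of pure pattern-matching over tokens. It is a toy bigram model, nothing like ChatGPT's actual transformer architecture, and the tiny corpus is made up for illustration; the point is that the "model" only manipulates co-occurrence counts and has no representation of cookies, theft, or truth:

```python
# Toy bigram "language model" (NOT how ChatGPT works; an illustration
# of pattern-matching over tokens with zero grounding in reality).
import random
from collections import Counter, defaultdict

# Hypothetical miniature training corpus, invented for this example.
corpus = (
    "did you take a cookie ? no , i never took a cookie . "
    "did you take a cookie ? yes , i took a cookie . "
).split()

# Count which token tends to follow which; this is all the model "knows".
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def reply(prompt, max_tokens=10):
    """Extend the prompt one likely token at a time. Nothing here
    represents cookies or lying, only co-occurrence counts."""
    tokens = prompt.split()
    for _ in range(max_tokens):
        candidates = follows.get(tokens[-1])
        if not candidates:
            break
        nxt = random.choices(list(candidates),
                             weights=list(candidates.values()))[0]
        tokens.append(nxt)
        if nxt == ".":
            break
    return " ".join(tokens)

print(reply("did you take a cookie ?"))
# Might print "did you take a cookie ? no , i never took a cookie ."
# or the "yes" variant: whichever pattern it samples, with no idea
# which answer is true.
```

Whether it answers "yes" or "no" depends only on which continuation the statistics favor, which is exactly the sense in which calling its output a "lie" presumes an understanding it does not have.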

Essentially, it's the Chinese room argument: there is no understanding or intentionality.
