Comment on Study finds that ChatGPT will cheat when given the opportunity and lie to cover it up later.

DarkGamer@kbin.social ⁨11⁩ ⁨months⁩ ago

It has no fundamental grasp of concepts like truth; it just repeats words that simulate human responses. It's glorified autocomplete that yields impressive results. Do you consider your autocomplete to be lying when it picks the wrong word?

If making it pretend to be a stock picker and putting it under pressure makes it return lies, that's because it was trained on data indicating that's the likely sequence of words for such a query.

Large language models are also probabilistic. You could ask one the same question over and over again and get totally different responses each time, some of which are inaccurate. Are they lies, though? For a creature to lie, it has to know that it's returning untruths.
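The probabilistic point can be made concrete with a toy sketch of temperature sampling, the standard way LLMs pick the next token. The vocabulary and logits here are invented for illustration; real models do this over tens of thousands of tokens, but the mechanism is the same: the model draws from a probability distribution, so repeated queries can yield different completions.

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0, rng=random):
    """Sample a token index from logits after temperature scaling."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# Hypothetical next-token choices for "The stock ...":
vocab = ["rose", "fell", "held steady"]
logits = [2.0, 1.5, 0.5]

# At temperature 1.0, repeated runs can pick different tokens.
print([vocab[sample_with_temperature(logits)] for _ in range(5)])
```

At low temperature the distribution sharpens toward the highest-logit token, making answers more repeatable; at higher temperatures the tail tokens get picked more often, which is one source of the "different answer every time" behavior.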
