arc99@lemmy.world 3 weeks ago
Hardly surprising. LLMs aren’t -thinking- they’re just shitting out the next token for any given input of tokens.
stevedice@sh.itjust.works 3 weeks ago
That’s exactly what thinking is, though.
arc99@lemmy.world 3 weeks ago
An LLM is an ordered series of parameterized / weighted nodes which are fed a bunch of tokens; millions of calculations later it generates the next token to append, and the process repeats. It’s like turning a handle on some complex Babbage-esque machine. LLMs use a tiny bit of randomness when choosing the next token, so the responses are not identical each time.
But it is not thinking. Not even remotely so. It’s a simulacrum.
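For anyone curious, here’s roughly what that handle-turning loop looks like. This is a minimal sketch, not any real library’s API: `model` is a hypothetical stand-in for the actual network (assumed to return one logit per vocabulary entry), and the temperature sampling at the end is where the “tiny bit of randomness” comes in.

```python
import math
import random

def sample_next_token(logits, temperature=0.8):
    # Scale logits by temperature: lower values sharpen the
    # distribution, higher values make it more random.
    scaled = [l / temperature for l in logits]
    # Softmax: turn scaled logits into a probability distribution.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one token id according to that distribution.
    return random.choices(range(len(probs)), weights=probs, k=1)[0]

def generate(model, tokens, n_new, temperature=0.8):
    # Autoregressive loop: feed in all tokens so far, get logits
    # for the next one, sample it, append it, repeat.
    for _ in range(n_new):
        logits = model(tokens)  # hypothetical: millions of weighted ops inside
        tokens.append(sample_next_token(logits, temperature))
    return tokens
```

Same input tokens, different runs, slightly different outputs, entirely because of that one `random.choices` call.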
stevedice@sh.itjust.works 2 weeks ago
I know what an LLM is doing. You don’t know what your brain is doing.