Quatity_Control@lemm.ee 1 year ago

Not really. LLMs use tokens instead of actual words to process text. There’s a layer of dissociation. That’s different from taking pre-existing knowledge, understanding it, and using it to derive new knowledge.
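To illustrate the point about tokens: a rough sketch of how subword tokenizers (BPE-style) split words before the model ever sees them. The vocabulary, token IDs, and word splits here are invented for the example, not from any real model:

```python
# Toy illustration: an LLM operates on subword token IDs, not on words.
# This vocabulary and its IDs are made up for the example.
vocab = {"un": 0, "believ": 1, "able": 2}

def tokenize(word, vocab):
    """Greedy longest-match subword split, loosely like BPE tokenizers."""
    tokens = []
    i = 0
    while i < len(word):
        # Try the longest remaining piece first, shrinking until a match.
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in vocab:
                tokens.append(vocab[piece])
                i = j
                break
        else:
            raise ValueError(f"no token covers {word[i:]!r}")
    return tokens

print(tokenize("unbelievable", vocab))  # → [0, 1, 2]
```

So the model never receives "unbelievable" as a unit; it receives the ID sequence [0, 1, 2], which is the layer of dissociation the comment is describing.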
