Comment on [deleted]

Quatity_Control@lemm.ee 1 year ago
Not really. The LLMs use tokens instead of actual words to understand the words. There's a layer of disassociation. That's different to taking pre-existing knowledge, understanding it, and using it to divine more knowledge.

macallik@kbin.social 1 year ago
The layer of disassociation is present with humans speaking different languages too though, right? My point is that once we can understand each other, we are all building on what already exists.

Quatity_Control@lemm.ee 1 year ago
Language is not disassociation.
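For what it's worth, the tokenization point above can be sketched with a toy example. This is a hypothetical greedy subword tokenizer with a made-up six-entry vocabulary, not any real LLM's tokenizer; it only illustrates that the model sees integer IDs rather than the words themselves:

```python
# Toy illustration (NOT a real tokenizer): text is mapped to integer
# token IDs before the model ever sees it, so the model operates on
# numbers, and whole words may be split into several subword pieces.
toy_vocab = {"under": 0, "stand": 1, "ing": 2, "token": 3, "s": 4, " ": 5}

def toy_tokenize(text, vocab):
    """Greedy longest-match split of `text` into subword token IDs."""
    ids = []
    i = 0
    while i < len(text):
        # Try the longest remaining substring first, shrinking until
        # a piece is found in the vocabulary.
        for j in range(len(text), i, -1):
            piece = text[i:j]
            if piece in vocab:
                ids.append(vocab[piece])
                i = j
                break
        else:
            raise ValueError(f"no token covers {text[i]!r}")
    return ids

print(toy_tokenize("understanding tokens", toy_vocab))
# → [0, 1, 2, 5, 3, 4]  ("under|stand|ing| |token|s")
```

Real tokenizers (e.g. BPE-based ones) build their vocabularies from data and have tens of thousands of entries, but the word-to-ID indirection is the same.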
rufus@discuss.tchncs.de 1 year ago
Yeah. We’re all embedded in a context and in a world surrounding us. So are computer language models.
slazer2au@lemmy.world 1 year ago
RFC1925 6a at it again