Comment on I'm looking for an article showing that LLMs don't know how they work internally

theunknownmuncher@lemmy.world · 1 week ago

There is a distinction between data and an action you perform on data (matrix maths, codec algorithm, etc.). It’s literally completely different.

Incorrect. An algorithm is itself data: the same bytes can be stored, copied, transmitted, or executed, and which of those happens is a matter of interpretation, not of some intrinsic difference in kind. You might want to take an information theory class before speaking on subjects like this.
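To make the point concrete, here is a minimal Python sketch (my own illustration, not from the thread): a function's behavior lives in bytes that you can inspect like any other data, and those same bytes can be turned back into a callable action.

```python
import types

def double(x):
    return 2 * x

# The function's behavior is encoded as bytes -- i.e., data.
code_bytes = double.__code__.co_code
print(type(code_bytes))  # <class 'bytes'>

# Rebuild a callable from that same code object: data becomes action again.
clone = types.FunctionType(double.__code__, {})
print(clone(21))  # 42
```

The same blurring shows up in an LLM: the weight matrices are plain arrays of numbers, yet together with a generic matrix multiply they fully determine what the model does.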

LLMs are just tools not sentient or verging on sentient

Correct. No one claimed they are “sentient” (you actually mean “sapient”, not “sentient”, but that’s fine, as most people mix those up). And no, LLMs are not sapient either; sapience has nothing to do with reasoning or logic, so you’re just moving the goalposts.
