Yes, both systems - the human brain and an LLM - assimilate and organize human written languages in order to use them for communication. An LLM is very little else beyond this. It is then given rules (using those written languages) and designed to produce more related words when given input. I just don’t find it convincing that an ML algorithm designed explicitly to mimic human written communication in response to given input “understands” anything. No matter *how convincingly* an algorithm might reproduce a human voice - perfectly matching intonation and inflection when given text to read - if I knew it was an algorithm designed to do that as convincingly as possible, I wouldn’t say it was capable of the feeling it is able to express.
The only thing in favor of sentience is that the ML algorithms modify themselves during training and end up as a black box - so complex, and with no compact way to represent them, that they are effectively impossible for humans to comprehend. Could one somehow have achieved sentience? Technically, yes, because we don’t understand how they work. We are just meat machines, after all.
UnculturedSwine@lemmy.dbzer0.com 10 months ago
Feeling is analog and requires an actual nervous system, which is dynamic. LLMs exist in a static state that is read from and processed algorithmically. An LLM is only a simulacrum of life and feeling; it has only some of the needed characteristics. Where exactly that boundary lies is hard to determine, I think. Admittedly, we still don’t have a full grasp of what consciousness even is. Maybe I’m talking out my ass, but that is how I understand it.
iopq@lemmy.world 10 months ago
You just posted random words like “dynamic” without explanation
OccasionallyFeralya@lemmy.ml 10 months ago
You’re in a programming board and you don’t understand static/dynamic states?
iopq@lemmy.world 10 months ago
Not in the hand-wavy way it was used in the last post. I understand that Python is dynamically typed, which has nothing to do with the topic
Enkers@sh.itjust.works 10 months ago
Not them, but “static” in this context means the model doesn’t have the ability to update its own weights on the fly. If you want a model to learn something new, it has to be retrained.
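A toy sketch of that distinction, with made-up names and a trivial linear “model” (nothing like a real LLM): inference only ever *reads* the weights, while learning something new requires a separate retraining step that produces new weights.

```python
# Toy "model": a fixed weight vector. Inference reads the weights
# but never writes to them, so the model is "static" at runtime.
def predict(weights, x):
    return sum(w * xi for w, xi in zip(weights, x))

# "Retraining" is a separate, explicit step that produces NEW weights
# (here, a single gradient-descent update on squared error).
def retrain(weights, x, target, lr=0.1):
    error = predict(weights, x) - target
    return [w - lr * error * xi for w, xi in zip(weights, x)]

weights = [0.5, -0.2]
before = list(weights)

# Any number of inferences leaves the weights untouched.
for _ in range(100):
    predict(weights, [1.0, 2.0])
assert weights == before

# Only retraining yields a model that behaves differently.
new_weights = retrain(weights, [1.0, 2.0], target=1.0)
assert new_weights != weights
```

Deployed LLMs work the same way in this one respect: a chat session changes the context fed into the model, not the weights themselves.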
iopq@lemmy.world 10 months ago
That makes more sense