Different person here.
For me, the big disqualifying factor is that LLMs don’t have any mutable state.
We humans have machinery in our brains that changes our state in reaction to input (through hormones, memories, etc.). Some of those state changes are reversible, others aren’t. Some can be made consciously, some can be influenced consciously, and some are entirely subconscious. The same is true for most animals we have observed: we can change their states through various means. In my opinion, this is a prerequisite for feeling anything.
Once we use models with components dedicated to that kind of functionality, it’ll become a lot harder for me personally to argue against them having “feelings”, especially because in my worldview continuity is not a prerequisite but mostly an illusion.
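A toy sketch of the distinction I’m drawing (purely illustrative, nothing like a real model’s internals): a stateless mapping versus an object whose internal state inputs can shift:

```python
# Hypothetical toy code, not any real model: stateless vs. stateful replies.

def stateless_reply(prompt: str) -> str:
    # Same input always yields the same output; nothing persists between calls.
    return prompt.upper()

class StatefulAgent:
    def __init__(self) -> None:
        self.mood = 0  # internal state that inputs can shift

    def reply(self, prompt: str) -> str:
        if "bad news" in prompt:
            self.mood -= 1  # the input itself changed the agent's state
        return f"(mood={self.mood}) {prompt.upper()}"

agent = StatefulAgent()
print(stateless_reply("bad news"))  # BAD NEWS
print(agent.reply("bad news"))      # (mood=-1) BAD NEWS
print(agent.reply("bad news"))      # (mood=-2) BAD NEWS, the state persisted
```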
UnculturedSwine@lemmy.dbzer0.com 4 weeks ago
Feeling is analog and requires an actual nervous system, which is dynamic. An LLM exists in a static state that is read from and processed algorithmically. It is only a simulacrum of life and feeling; it has only some of the needed characteristics. Where that boundary lies is hard to determine, I think. Admittedly, we still don’t have a full grasp of what consciousness even is. Maybe I’m talking out my ass, but that’s how I understand it.
iopq@lemmy.world 4 weeks ago
You just posted random words like “dynamic” without explanation.
Enkers@sh.itjust.works 4 weeks ago
Not them, but “static” in this context means the model doesn’t have the ability to update its own weights on the fly. If you want a model to learn something new, it has to be retrained.
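A minimal PyTorch sketch of that, with a toy linear layer standing in for a full model (illustrative only):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)            # toy stand-in for a trained network
before = model.weight.clone()

# Inference: nothing the input does changes the weights.
with torch.no_grad():
    _ = model(torch.randn(1, 4))
assert torch.equal(before, model.weight)      # unchanged by inference

# "Learning something new" is a separate retraining step that rewrites weights.
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss = model(torch.randn(1, 4)).sum()
loss.backward()
opt.step()
assert not torch.equal(before, model.weight)  # only training changed them
```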
iopq@lemmy.world 4 weeks ago
That makes more sense
OccasionallyFeralya@lemmy.ml 4 weeks ago
You’re on a programming board and you don’t understand static/dynamic state?
iopq@lemmy.world 4 weeks ago
Not in the hand-wavy way it was used in the last post. I understand that Python is dynamically typed, but that has nothing to do with the topic.
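For instance, dynamic typing just means a name can be rebound to a value of any type at runtime, which says nothing about whether a model’s weights can change:

```python
# Dynamic *typing* (the Python sense): a name can be rebound to any type.
x = 42        # x refers to an int
x = "hello"   # now x refers to a str; the runtime doesn't object
```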