Comment on LLMs develop their own understanding of reality as their language abilities improve
Deceptichum@quokk.au 2 months ago
No, I’m not suggesting that.
I’m suggesting that if we don’t even understand how consciousness works for ourselves, we cannot make claims about how it will look for other things.
Deterministically, free will does not exist; if we cannot exercise free will, we cannot have independent thoughts, just the same as a machine.
Truth is, we don’t really know shit; we’re biological machines that think they’re in control of themselves.
I’m suggesting that if we don’t even understand how consciousness works for ourselves, we cannot make claims about how it will look for other things.
So is a rock conscious? I guess we’ll never know.
I suspect others are talking about “thinking” only objectively.
A) If an LLM has no input, then there are no processes going on at all which could be described as thinking (objective: what is the program doing?).
B) If an LLM had a subjective experience when given input, presumably it has none when all processes are stopped (subjective and unverifiable).
atrielienz@lemmy.world 2 months ago
Okay. Feed a new species that hasn’t been named yet into an LLM. Does it name that new creature? Can it decide what family, phylum, etc. it belongs to? Does it pick up on the specific attributes of that new species?
Deceptichum@quokk.au 2 months ago
It might even be able to pick those things out; I certainly couldn’t.
atrielienz@lemmy.world 2 months ago
That is very deliberately not in the spirit of the question I asked. It’s almost like you’re intent on misunderstanding just so you can feel like you’re right.
Deceptichum@quokk.au 2 months ago
You asked if it could do a task I wasn’t even capable of doing, and made that your assessment of consciousness.