Comment on Bcachefs creator claims his custom LLM is 'fully conscious'
sentient_loom@sh.itjust.works 1 day ago
fully conscious according to any test I can think of
There’s no such thing as an actual test for consciousness in machines. We can do tests on animals to see if their sensory experience includes self-awareness, but we’re already operating on the assumption that they have feelings and sensory experience because they have a brain and nervous system like us, and they’re all directly related to us (as all organisms are). But that’s totally different from designing a machine which mimics (or predicts/auto-completes) our observable behavior and then assuming that it “doesn’t like” something or does anything “for fun.”
What sucks is that some idiots are going to start falling for this. And eventually software will be given human rights, which actually means that the software’s owners will have extra rights compared to the rest of us.
madrabeagdubh@piefed.social 1 day ago
Yeah, that’s as far as I’ve been able to go, thinking about this. To me, it’s clear that non-human animals are conscious. But we treat them like raw materials, for reasons which fall apart immediately in debate. AI might not be conscious the way a pig or a duck is. But it seems more conscious than a cup of sand or a box of crayons.
Peruvian_Skies@sh.itjust.works 1 day ago
“Seems” being the operative word here. But children think that Muppets are conscious. The most famous work of fantasy fiction is about a malevolent piece of jewelry. Humans are very good at attributing consciousness to non-conscious entities. We are easily fooled in this respect.
Even if some putative AI may be conscious, an LLM is just a statistical model that predicts the next word using probability weights. This technology cannot lead to consciousness.
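As an aside, the “probability weights” part is roughly right even if the database framing isn’t: generation is sampling the next token from a weighted distribution. A minimal sketch of that sampling step, with a hypothetical toy bigram table (illustrative weights, my own invention) standing in for the neural network a real LLM would use to compute those weights from the whole context:

```python
import random

# Toy "probability weights": given the previous word, the weights
# over candidate next words. Hypothetical numbers; a real LLM
# computes a distribution like this with a neural network over the
# full context rather than looking anything up in a table.
BIGRAM_WEIGHTS = {
    "the": {"cat": 0.5, "dog": 0.3, "ring": 0.2},
    "cat": {"sat": 0.6, "ran": 0.4},
    "dog": {"barked": 0.7, "slept": 0.3},
}

def next_word(prev: str) -> str:
    """Sample the next word in proportion to its weight."""
    dist = BIGRAM_WEIGHTS[prev]
    return random.choices(list(dist), weights=list(dist.values()), k=1)[0]

# Generate a short continuation from a seed word.
word = "the"
out = [word]
while word in BIGRAM_WEIGHTS:
    word = next_word(word)
    out.append(word)
print(" ".join(out))  # e.g. "the cat sat"
```

The only thing the sketch shows is the sampling step; everything interesting in an actual LLM is in how those weights are computed, which is exactly what the table elides.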
Telorand@reddthat.com 1 day ago
Are you talking about The Pearl, by chance? It’s one I haven’t read yet, but if you’re talking about another story, I’d like to read that, too!
Peruvian_Skies@sh.itjust.works 1 day ago
I was referring to The Lord of the Rings.
Iconoclast@feddit.uk 1 day ago
That would mean it feels like something to be an LLM. I don’t see any reason to think that. I’m not going to claim it absolutely is not, because I couldn’t possibly know, but I’m about as sure of that as I am that it is like something to be my pet gerbil.
Hackworth@piefed.ca 1 day ago
We have precedent for dealing with things within our own imaginations that seem to have autonomy. Authors commonly talk about their characters seeming to take on a life of their own over time. Dream characters can genuinely surprise the dreamer. The esoteric traditions of invocation/evocation can be viewed as intentional applications of this feature in semantic/latent space.
But if the idea is that LLMs are a kind of external imagination, the question isn’t really whether or not the characters roleplayed during inference are conscious. They’re no more aware than the people in our dreams. The question is, as you say, what is it like to be those layers of software neurons in between the word generations. Can you have an imagination without an imaginer? In other words, is there a dreamer?
If the answer is no, case closed, relatively tidy. If the answer is yes, it’s a truly alien kind of consciousness. Embodiment comes with a bunch of stuff that an LLM has absolutely no access to. Generally speaking, we find it difficult to put ourselves in the shoes of other humans, much less animals or plants/fungi. And they’re embodied! LLMs are nothing like us, and they’re certainly not gendered.
Iconoclast@feddit.uk 1 day ago
I’ve honestly never considered before whether it could be like something to be a character in my dream - if it’s part of the same consciousness. Doesn’t seem obvious that it couldn’t be.
And my personal view is that the answer is definitely no. There’s no dreamer. The dream is appearing in the consciousness of a biological being with my genes, history, and memories that’s currently in a state of sleep.
This comes with other ramifications too. There’s also no decision-maker - it’s an illusion.