Comment on "AI could already be conscious. Are we ready for it?"
amelia@feddit.org 1 week ago
You talk like you know what the requirements for consciousness are. How do you know? As far as I know, that's an unsolved philosophical and scientific problem.
MagicShel@lemmy.zip 1 week ago
I have a set of attributes that I associate with consciousness. We can disagree in part, but if your definition is so broad as to include math formulas, there isn't any common ground for us to discuss this.
If you want to say contemplation/awareness of self isn't part of it, then fine. I'm not as precious about an ant-like consciousness as I would be about a human-like perception of self, and people can debate what ethical obligations we'd have to such a consciousness once we can even achieve it, but we aren't there yet. LLMs are nothing but a process of transforming input to output. I think consciousness requires rather more than that, or we wind up with erosion being considered a candidate for consciousness.
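To make that "transforming input to output" point concrete, here's a minimal sketch (a hypothetical toy model, not any real LLM's API): inference is just a function of the prompt and a set of frozen weights, so the same input always yields the same next-token distribution and nothing persists between calls.

```python
# Toy illustration of LLM inference as a pure input -> output transformation.
# Hypothetical model; real LLMs are vastly larger but structurally similar at inference.
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["the", "cat", "sat", "on", "mat", "."]
W = rng.normal(size=(len(VOCAB), len(VOCAB)))  # fixed "weights", frozen after training

def next_token_distribution(prompt_ids: list[int]) -> np.ndarray:
    """Map a token sequence to a probability distribution over the next token."""
    # Toy 'model': sum the weight rows for the prompt tokens, then softmax.
    logits = W[prompt_ids].sum(axis=0)
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

# Same input, same output; no internal state carries over between calls.
p1 = next_token_distribution([0, 1])  # "the cat"
p2 = next_token_distribution([0, 1])
assert np.allclose(p1, p2)
print(VOCAB[int(p1.argmax())])
```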
So I'm not the authority, but if we don't adhere to some reasonable layman's definition, it quickly gets into weird wankery that I don't see any value in exploring.
amelia@feddit.org 3 days ago
AI isn't math formulas though. AI is a complex dynamic system reacting to external input. In that regard there's no fundamental difference from a human brain, imo. It's just that the processing isn't happening in biological tissue but in silicon. Is it way less complex than a human? Sure. Is there a fundamental qualitative difference? I don't think so.