Comment on We need to stop pretending AI is intelligent
Simulation6@sopuli.xyz 2 days ago
The book The Emperor's New Mind is old (1989), but it gave a good argument for why machine-based AI was not possible. Our minds work on a fundamentally different principle then Turing machines.
> Our minds work on a fundamentally different principle then Turing machines.

“than”… IF, THEN; MORE THAN.

Is that an advantage, or a disadvantage? I’m sure the answer depends on the setting.
Knock_Knock_Lemmy_In@lemmy.world 2 days ago
It’s hard to judge that book’s argument from the Wikipedia entry, but I don’t see it arguing that intelligence needs to have senses, flesh, nerves, pain, and pleasure. It’s just saying that computer algorithms are not what humans use.
Simulation6@sopuli.xyz 1 day ago
I think what he is implying is that current computer designs will never be able to gain consciousness. Maybe a fundamentally different type of computer could, but is anything like that even on the horizon?
jwmgregory@lemmy.dbzer0.com 1 day ago
Possibly.

Current machines aren’t really capable of what we would consider sentience, because of the von Neumann bottleneck.
Simply put, computers treat memory and computation as separate tasks, leading to an explosion in the system resources needed for tasks that would be relatively trivial for a brain-system to do, largely due to things like buffers and memory-management code. Much of this is hidden from the engineer and the end user these days, so people aren’t really aware of just how complex most modern computational systems are.
This is why, if I threw a ball at you, you would reflexively catch it, dodge it, or parry it, and your brain would do so for an amount of energy similar to that needed to power a simple LED. That is a highly complex physics calculation run in a very short time for an incredibly low amount of energy relative to the amount of information in the system. The brain is capable of this because it doesn’t store information in a chest and later retrieve it the way contemporary computers do. Brains are Turing machines; they just aren’t von Neumann machines. In the brain, information is stored… within the actual system itself.

The mechanical operation of the brain is so highly optimized that it likely isn’t physically possible to build a much more efficient computer without venturing into the realm of strange quantum mechanics. Even then, the jury is still out on whether natural brains don’t do something like this to some degree as well. We know a whole lot about the brain, but some damnable incompleteness-theorem-adjacent effect seems to prevent us from comprehending the actual mechanics of our own brains, from inside the brain itself, in a holistic manner.
That’s actually one of the things AI and machine learning might be great for. If it is impossible to explain the human experience from inside the human experience, then we must build a non-human experience and ask its perspective on the matter; again, simply put.
Knock_Knock_Lemmy_In@lemmy.world 1 day ago
I believe what you say. I don’t believe that is what the article is saying.
Asetru@feddit.org 1 day ago
If you can bear the cringe of the interviewer, there’s a good interview with Penrose that goes in the same direction: m.youtube.com/watch?v=e9484gNpFF8