Comment on What is a good eli5 analogy for GenAI not "knowing" what they say?
magic_lobster_party@kbin.run 6 months ago
There’s the Chinese Room argument, which is a bit related:
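Not part of the original argument, but a rough way to picture the "room": a program that maps input symbols to output symbols by rule alone, with no representation of what any symbol means. Everything below (the RULEBOOK table, the room function, the example phrases) is a made-up toy sketch of "syntax without semantics", not anyone's real system.

```python
# Toy sketch of the Chinese Room as pure symbol lookup.
# The "rulebook" is a hypothetical mapping from input symbols to replies;
# the program never models what the symbols mean.

RULEBOOK = {
    "你好": "你好，很高兴认识你",
    "你是谁": "我是一个房间",
}

def room(symbols: str) -> str:
    """Return whatever the rulebook dictates; 'understanding' never enters into it."""
    return RULEBOOK.get(symbols, "请再说一遍")

if __name__ == "__main__":
    # Produces a sensible-looking reply purely by lookup.
    print(room("你好"))
```

The replies can look fluent to someone outside the room even though nothing inside it knows Chinese, which is the intuition the thought experiment leans on.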
Asifall@lemmy.world 6 months ago
I always thought the Chinese Room argument was kinda silly. It’s predicated on the idea that humans have some unique capacity to understand the world that can’t be replicated by a syntactic system, but there is no attempt made to actually define this capacity.
The whole argument depends on our intuition that we think and know things in a way inanimate objects don’t. In other words, it’s circular to draw the conclusion that computers can’t think from the premise that computers can’t think.