Third, it would need free will.
I strongly disagree there. I’d argue that not even humans have free will, yet we’re generally intelligent, so I don’t see why AGI would need it. In fact, I don’t even know what true free will would look like. There are only two reasons anyone does anything: either you want to or you have to. There’s obviously no freedom in having to do something, but you can’t choose your wants and not-wants either. You helplessly have the beliefs and preferences that you do. You didn’t choose them, and you can’t choose to not have them.
orb360@lemmy.ca 1 week ago
The human mind isn’t infinitely complex. Consciousness has to be a tractable problem imo. I watched Westworld so I’m something of an expert on the matter.