How can we ask “are we closer” if we don’t know what the destination is?
LLMs might still turn out to be an interesting special-purpose technology, perhaps with fairly broad applications, but one headed in a different direction from wherever true AGI eventually comes from.