I would love to believe that AI is sentient, but the current technology we are using is just not there yet. Until I see irrefutable evidence that LLMs are sentient, I am going to remain skeptical.
Believing that what we currently have is sentient, and possibly a new form of life, is falling for the marketing ploys of the corpos trying to make massive amounts of money off investors.
algocademy.com/…/why-ai-can-follow-logic-but-cant…
AI systems are fundamentally limited by their training data. They cannot truly create logic that goes beyond what they’ve been exposed to during training. While they can combine existing patterns in new ways, giving the appearance of creativity, they cannot make the kind of intuitive leaps that characterize human innovation.
Opinionhaver@feddit.uk 1 week ago
LLMs are AI. While they're not generally intelligent, they still fall under the umbrella of artificial intelligence; AGI is a subset of AI. Sentience, on the other hand, has nothing to do with it. It's entirely conceivable that an AGI system could lack any form of subjective experience while still outperforming humans on most, if not all, cognitive tasks.