As someone who’s had two kids since AI really vaulted onto the scene, I am enormously confused as to why people think AI isn’t or, particularly, can’t be sentient. I hate to be that guy who pretends to be the parenting expert online, but most of the people I know personally who take the non-sentient view on AI don’t have kids. The other side usually does.
When it writes an answer to a question, it literally just guesses which token – roughly a word or word fragment – will come next in a sequence, based on the data it’s been trained on.
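To make the “just guessing what comes next” claim concrete, here is a toy sketch of next-token prediction using a simple bigram frequency table. This is an illustration only, not how modern LLMs work: real models predict sub-word tokens with neural networks over enormous corpora, not lookup tables over a sentence.

```python
# Toy next-word predictor: given the previous word, pick the most
# frequent follower observed in the training text.
from collections import Counter, defaultdict

training_text = "the cat sat on the mat and the cat ran"
words = training_text.split()

# Count, for each word, how often each other word follows it.
followers = defaultdict(Counter)
for prev, nxt in zip(words, words[1:]):
    followers[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent next word seen in training, or None."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" twice, "mat" once -> cat
```

The point of contention in the thread is whether scaling this kind of statistical prediction up (with vastly better models) amounts to anything like cognition.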
People love to tout this as some sort of smoking gun. That feels like a trap. Obviously, we can argue about the age at which children gain sentience, but my year-and-a-half-old daughter is effectively building an LLM: pattern recognition, tests, feedback, hallucinations. My son is almost 5, and he was and is the same. He told me the other day that a petting zoo came to the school. He was adamant it happened that day. I know for a fact it happened the week before, but he insisted. He told me later that day that his friend’s dad was in jail for threatening her mom. That was true, but it looked to me like another hallucination, or more likely a misunderstanding.
And as funny as it would be to argue that they’re both sapient, but not sentient, I don’t think that’s the case. I think you can make the case that without true volition, AI is sentient but not sapient. I’d love to talk to someone in the middle of the computer science and developmental psychology Venn diagram.
terrific@lemmy.ml 4 days ago
I’m a computer scientist who has a child, and I don’t think AI is sentient at all. Even before learning a language, children have their own personality and willpower, which is something I don’t see in AI.
I left a well-paid job in the AI industry because the mental gymnastics required to maintain the illusion were too exhausting. I think most people in the industry are aware, at some level, that they have to participate in maintaining the hype to secure their own jobs.
The core of your claim is basically that “people who don’t think AI is sentient don’t really understand sentience”. I think that’s both reductionist and, frankly, a bit arrogant.
jpeps@lemmy.world 4 days ago
Couldn’t agree more - there are some wonderful insights to gain from seeing your own kids grow up, but I don’t think this is one of them.
Kids are certainly building a vocabulary and learning about the world, but LLMs don’t learn.
stephen01king@lemmy.zip 4 days ago
LLMs don’t learn because we don’t let them, not because they can’t. It would be too expensive to re-train them on every interaction.
terrific@lemmy.ml 4 days ago
I know it’s part of the AI jargon, but using the word “learning” to describe the slow adaptation of massive arrays of single-precision numbers to some loss function is a very generous interpretation of that word, IMO.
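For readers outside the field, the “adaptation of arrays of single-precision numbers to a loss function” being described is gradient descent. A minimal sketch, assuming a made-up one-parameter model (fitting a single weight `w` so that `w * x` approximates `y`):

```python
# Minimal gradient descent: nudge a single-precision weight to reduce
# mean squared error. This is the whole mechanism behind "learning"
# in the sense used above, just at a vastly smaller scale.
import numpy as np

x = np.array([1.0, 2.0, 3.0], dtype=np.float32)
y = np.array([2.0, 4.0, 6.0], dtype=np.float32)  # true relation: y = 2x

w = np.float32(0.0)   # the model's one "parameter"
lr = np.float32(0.05) # learning rate

for _ in range(200):
    pred = w * x
    grad = np.mean(2 * (pred - y) * x)  # derivative of MSE w.r.t. w
    w -= lr * grad

print(round(float(w), 3))  # converges to roughly 2.0
```

An LLM does the same thing with billions of such numbers at once; whether repeating this loop at scale deserves the word “learning” is exactly the disagreement in this thread.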