LLMs are just fast sorting and probability; they have no way to ever develop novel ideas or comprehension.
The system he’s talking about is more about using NNL, which builds new relationships to things that persist. It’s deferential relationship learning and data path building. It doesn’t exist yet, so if he has some ideas, it may be interesting. Also more likely to be the thing that kills all humans.
tomiant@piefed.social 1 month ago
Look, AGI would require basically a human brain. LLMs are a very specific subset mimicking an (important) part of the brain: our language module. There’s more, but I got interrupted by a drunk guy who needs my attention. I’ll be back.
krooklochurm@lemmy.ca 1 month ago
WHAT HAPPENED WITH THE DRUNK DUDE?
tomiant@piefed.social 1 month ago
He offered me a job.
krooklochurm@lemmy.ca 1 month ago
Did you accept?