You might benefit from watching Hinton’s lecture; much of it details the technical reasons why digital is much, much better than analog for intelligent systems.
BTW, that is the opposite of what he set out to prove.
He says the facts forced him to change his mind.
Trainguyrom@reddthat.com 4 months ago
That, and with the way companies have been building AI, they’ve done very little to optimize compute, instead trying to get the research out faster because that’s what’s expected in this bubble. I’m absolutely expecting future research to find plenty of ways to optimize these major models.
But also, since R&D has been entirely focused on digital chips, I would not be at all surprised if there were performance and/or efficiency gains to be had in certain workloads by shifting to analog circuits.