That, and the way companies have been building AI: they've done very little to optimize compute, instead racing to get research out faster because that's what this bubble rewards. I'm fully expecting future research to find plenty of ways to optimize these major models.
But R&D has also been focused almost entirely on digital chips. I wouldn't be at all surprised if there were performance and/or efficiency gains to be had in certain workloads by shifting to analog circuits.
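For a sense of why that's plausible, the textbook analog win is matrix-vector multiply: store weights as conductances in a crossbar, apply inputs as voltages, and every multiply-accumulate happens at once via Ohm's and Kirchhoff's laws, at the cost of noise and ADC quantization on readout. Here's a toy simulation of that tradeoff; the noise level and bit depth are illustrative assumptions, not measurements from any real hardware:

```python
# Toy sketch: an "analog" matrix-vector multiply where the physics does the
# multiply-accumulates in one step, but the result comes back noisy and
# quantized. All parameters below are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def analog_matvec(weights, x, noise_std=0.01, adc_bits=8):
    """Crossbar-style MVM: one circuit settle vs. m*n digital MACs."""
    ideal = weights @ x  # conductances * voltages summed along columns
    # Analog non-idealities: model as additive Gaussian noise (assumption).
    noisy = ideal + rng.normal(0.0, noise_std * np.abs(ideal).max(), ideal.shape)
    # Reading the currents back out requires an ADC, which quantizes.
    scale = float(np.abs(noisy).max()) or 1.0
    levels = 2 ** (adc_bits - 1)
    return np.round(noisy / scale * levels) / levels * scale

W = rng.normal(size=(256, 256))
x = rng.normal(size=256)

digital = W @ x
analog = analog_matvec(W, x)
rel_err = np.linalg.norm(analog - digital) / np.linalg.norm(digital)
print(f"relative error: {rel_err:.3%}")  # small but nonzero
```

The point of the sketch: the answer is approximate, which is exactly why inference-heavy, error-tolerant workloads are the ones people point to for analog, while training and anything needing exact arithmetic stays digital.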