Reading the article, I learned that the author does not really have a clue what he is talking about.
A mechanical clock is anything but analog - the escapement turns the oscillator’s motion into discrete ticks. Look up what an escape wheel is for if you doubt it.
For “analog is easier”: keep in mind that it is very hard to get chip-based circuits to produce precisely reproducible analog behavior. Indeed, this is one of the main reasons why we have digital computer chips: the output of the circuit is sufficiently unambiguous.
And “can run things in parallel” - that’s what e.g. FPGAs are for. One of my designs runs audio compression on 32 channels with a meagre 12 MHz clock, among many, many other tasks, all at the same time.
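
For a sense of the cycle budget behind that kind of parallelism, here is a back-of-the-envelope sketch (the 48 kHz sample rate and the single shared pipeline are my assumptions, not something stated above):

```python
# Rough cycle-budget estimate for multi-channel audio processing on an FPGA.
# Assumed, not from the comment: 48 kHz sample rate, one time-multiplexed
# processing pipeline. A real design could just as well instantiate parallel
# pipelines, giving every channel the full per-sample budget.

clock_hz = 12_000_000    # 12 MHz fabric clock (from the comment)
sample_rate_hz = 48_000  # assumed audio sample rate
channels = 32            # from the comment

cycles_per_sample = clock_hz // sample_rate_hz      # 250 clock cycles per sample period
cycles_per_channel = cycles_per_sample // channels  # ~7 cycles per channel if shared

print(cycles_per_sample, "cycles per sample period")
print(cycles_per_channel, "cycles per channel with one shared pipeline")
```

Either way the point stands: even a modest clock leaves room for a lot of concurrent work when the hardware itself is parallel.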
ignirtoq@fedia.io 1 day ago
My understanding is that digital computers rose to dominance not because of any superiority in capability, but basically because of error tolerance. When the intended values can only be "on" or "off," your circuit can be really poor due to age, wear, or other factors, but as long as it's within something like 40% of the expected "on" or "off" level, it functions basically the same as a perfect one. Analog computers don't have anywhere near that kind of tolerance, which makes them more fragile, more expensive, and harder to scale in production. (Rough illustration below.)
I'm really curious if the researchers address any of those considerations.
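
A minimal sketch of that noise-margin argument (illustrative only; the 20% per-stage noise figure is just an assumption for the example):

```python
import random

# Pass a value through a chain of noisy stages. The "digital" path snaps the
# value back to 0 or 1 after every stage, so moderate per-stage error never
# accumulates; the "analog" path carries its error forward and drifts.

random.seed(1)
STAGES = 20
NOISE = 0.2  # up to +/-20% error added per stage (assumed for illustration)

def noisy(x):
    return x + random.uniform(-NOISE, NOISE)

digital = 1.0
analog = 1.0
for _ in range(STAGES):
    digital = 1.0 if noisy(digital) > 0.5 else 0.0  # value is restored each stage
    analog = noisy(analog)                          # error just accumulates

print("digital after", STAGES, "stages:", digital)          # still exactly 1.0
print("analog  after", STAGES, "stages:", round(analog, 2)) # has drifted from 1.0
```

The digital value survives every stage because the threshold restores it; the analog value ends up wherever the accumulated noise pushes it.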
NaibofTabr@infosec.pub 1 day ago
Analog computers were also bulkier, had more mechanical complexity, required more power to operate, and generated more heat as a consequence. The Heathkit EC-1's computing circuits operated at 0-100V. There are some real physics problems with scaling analog circuits up to higher complexity.