Comment on Cornell's world-first 'microwave brain' computes differently
ignirtoq@fedia.io 1 day ago
My understanding of why digital computers rose to dominance was not any superiority in capability but basically just error tolerance. When the intended values can only be "on" or "off," your circuit can be really poor due to age, wear, or other factors, but as long as the signal lands within maybe 40% of the expected "on" or "off" level, it gets restored to the rail and functions basically the same as a perfect circuit. Analog computers have nowhere near those tolerances, since every deviation is part of the computed value, which makes them more fragile, more expensive, and harder to mass-produce.
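That restoration effect is easy to see in a toy simulation (a minimal sketch, not a circuit model; the stage count, noise spread, and 0.5 threshold are made-up illustrative numbers): a digital stage snaps its output back to 0 or 1 after every hop, so noise never accumulates, while an analog stage passes the perturbed value straight through and drifts.

```python
import random

random.seed(0)

def noisy(value, spread=0.2):
    """Perturb a signal, standing in for component drift or wear."""
    return value + random.uniform(-spread, spread)

def run_chain(signal, stages, digital):
    """Pass a signal through a chain of noisy stages.

    A digital stage regenerates the value to the nearest rail
    (0 or 1) after each hop, so per-stage noise is discarded;
    an analog stage forwards the perturbed value as-is, so the
    errors random-walk and accumulate.
    """
    for _ in range(stages):
        signal = noisy(signal)
        if digital:
            signal = 1.0 if signal >= 0.5 else 0.0  # restore to rail
    return signal

print(run_chain(1.0, 50, digital=True))   # stays exactly 1.0
print(run_chain(1.0, 50, digital=False))  # wanders away from 1.0
```

With the noise bounded below the threshold margin, the digital chain is bit-perfect after 50 stages no matter what; the analog chain's output depends on every perturbation along the way.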
I'm really curious if the researchers address any of those considerations.
NaibofTabr@infosec.pub 1 day ago
Analog computers were also bulkier, had more mechanical complexity, and required more power to operate, generating more heat as a consequence. The Heathkit EC-1's circuits operated at 0-100V. There are some real physics problems with scaling analog circuits up to higher complexity.