ignirtoq
@ignirtoq@fedia.io
- Comment on Cornell's world-first 'microwave brain' computes differently 1 day ago:
My understanding is that digital computers rose to dominance not through any superiority in capability but basically through error tolerance. When the intended values can only be "on" or "off," a circuit can degrade considerably from age, wear, or other factors, but as long as the signal stays within roughly 40% of the expected "on" or "off" level, it functions basically the same as a perfect one. Analog computers don't have anywhere near that kind of tolerance, which makes them more fragile, more expensive, and harder to scale in production.
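To illustrate the point: here's a toy simulation (my own sketch, not from the article) of a signal passing through a chain of noisy stages. The digital version snaps the value back to a clean 0 or 1 after every stage, so noise never accumulates; the analog version just carries the error forward, and it random-walks away from the true value.

```python
import random

random.seed(0)

def noisy_stage(value, noise=0.2):
    # One circuit stage that perturbs its input by up to +/- noise.
    return value + random.uniform(-noise, noise)

def digital_restore(value, threshold=0.5):
    # Digital logic restores the signal to an exact 0 or 1 each stage,
    # as long as the noise hasn't pushed it across the threshold.
    return 1.0 if value > threshold else 0.0

analog = 1.0
digital = 1.0
for _ in range(100):  # run the signal through 100 noisy stages
    analog = noisy_stage(analog)
    digital = digital_restore(noisy_stage(digital))

print(digital)           # 1.0 -- restored exactly every stage
print(abs(analog - 1.0)) # accumulated drift, typically well away from 0
```

With per-stage noise of +/-0.2 the digital signal can never cross the 0.5 threshold in one hop, so it survives 100 stages untouched, while the analog error keeps compounding. That's the restoration property the comment is describing.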
I'm really curious if the researchers address any of those considerations.