The thing that makes digital chips so much better than analog chips is something both you and the article are missing: noise. A digital chip is very robust against noise: as long as the noise in a single step isn't large enough to flip a bit outright, the stable logic levels pull the voltage back and no information is lost. Not so with analog logic. Because the information is continuous, every step that introduces noise (which is basically every step) loses some of it. Go a few levels of logic deep and all you've got left is noise.
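To make that concrete, here's a toy sketch of my own (not from the article; the stage count and noise level are made up) comparing a signal passed through a chain of noisy digital stages against the same chain of noisy analog stages. The digital path snaps back to 0 or 1 after every stage, so the noise is erased unless it crosses the threshold; the analog path has nothing to snap back to, so the error just accumulates.

```python
# Toy illustration: noise accumulation in cascaded digital vs analog stages.
# STAGES and NOISE are arbitrary values chosen for the demo.
import random

STAGES = 50    # levels of logic the signal passes through
NOISE = 0.05   # per-stage noise (fraction of full scale)

def noisy(value):
    """Add Gaussian per-stage noise to a level in [0, 1]."""
    return value + random.gauss(0.0, NOISE)

digital = 1.0   # a logical "1"
analog = 0.8    # some continuous quantity we care about

for _ in range(STAGES):
    # Restoring stage: anything above the threshold is pulled back to 1.0,
    # anything below to 0.0, so the noise vanishes unless it already caused
    # a bit flip.
    digital = 1.0 if noisy(digital) > 0.5 else 0.0

    # Analog stage: no stable level to pull back to, so every stage's error
    # stays in the signal.
    analog = noisy(analog)

print(f"digital after {STAGES} stages: {digital}")      # almost always still 1.0
print(f"analog  after {STAGES} stages: {analog:.3f}")   # drifted well away from 0.8
```

After 50 stages the analog value has typically drifted by several tenths of full scale, while the digital value is still exactly 1.0 unless a single stage's noise managed a roughly ten-sigma excursion.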
vacuumflower@lemmy.sdf.org 2 weeks ago
Which you often don’t need: mechanical computers for aircraft operation, hydraulic computers for modeling something nuclear, things like that.
But there’s nothing “century-old” about any of this. You might allow non-deterministic steps in calculations where determinism isn’t needed (if you’re ray-tracing a sphere, you’ll do fine with slightly different dithering each time), and dropping that requirement can buy better performance.
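As a rough sketch of that dithering point (my own example, not anything from the article; the grid size, sample count, and sphere parameters are arbitrary), here's a jittered-sampling render of a sphere's silhouette. The random sub-pixel offsets come out differently on every run, yet the anti-aliased result is perfectly usable each time.

```python
# Jittered (non-deterministic) sampling of a sphere's silhouette in ASCII.
# Every run uses different sample offsets, but the coverage estimate is fine.
import random

WIDTH, HEIGHT = 40, 20
SAMPLES = 4                    # random samples per pixel
CX, CY, R = 20.0, 10.0, 8.0    # circle centre and radius, in pixel units

def coverage(px, py):
    """Fraction of jittered samples inside the circle for one pixel."""
    hits = 0
    for _ in range(SAMPLES):
        x = px + random.random()   # non-deterministic sub-pixel offset
        y = py + random.random()
        # The factor of 2 on y roughly corrects for terminal characters
        # being about twice as tall as they are wide.
        if (x - CX) ** 2 + ((y - CY) * 2) ** 2 <= R * R:
            hits += 1
    return hits / SAMPLES

for py in range(HEIGHT):
    row = ""
    for px in range(WIDTH):
        c = coverage(px, py)
        row += "#" if c > 0.75 else ("+" if c > 0.25 else ".")
    print(row)
```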
The idea seems to make sense; it’s just never going to be revolutionary.