This is why some people (me included) don’t believe the quantum computers in their currently researched form can actually work in the real world.
And then there are some people (me included) who bet a whole beer on quantum computers being inherently impossible. Not the “get them to calculate” part, but the “shave a factor off the asymptotics of computers using ordinary physics” part. The argument is simple: it could very well be that the more data you try to squeeze into a qubit, the fuzzier the result gets, so if you load ten million numbers into each of two qubit registers and somehow make them add, you’ll get ten million results that are ten million times fuzzier than if you’d put in a single number. To the best of my knowledge I’ve not yet lost that bet: it has not been demonstrated that researchers won’t run into a wall there, essentially that the universe has a limited computation capacity per volume of space (or however you measure things at that scale).
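To make the hand-wavy part concrete, here’s a toy sketch in plain Python of the textbook baseline only (no real hardware, and `superposed_add` is a made-up helper for illustration): even in ideal quantum mechanics, loading N values into an equal superposition gives each branch amplitude 1/√N, so a single measurement hands you just one of the N sums, each with probability 1/N. The bet is that on top of that, the physical fuzziness itself grows with N.

```python
# Toy classical sketch of the textbook picture (no real qubits involved):
# N values in equal superposition each carry amplitude 1/sqrt(N), so a single
# measurement of the result register returns one sum with probability 1/N.
from math import sqrt

def superposed_add(values, constant):
    """Hypothetical helper: add `constant` to N values held in equal superposition."""
    n = len(values)
    amplitude = 1 / sqrt(n)               # each branch gets amplitude 1/sqrt(N)
    return [(v + constant, amplitude**2)  # (result, probability of measuring it)
            for v in values]

print(superposed_add([3, 5, 8, 13], constant=4))
# [(7, 0.25), (9, 0.25), (12, 0.25), (17, 0.25)]
```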
Other fun things to annoy people with: claim that deciding between P = NP and P ≠ NP is undecidable.
AliasAKA@lemmy.world 6 months ago
I think in general the goal is not to stuff more information into fewer qubits, but to stabilize more qubits so you can hold more information. The problem is in the physics of stabilizing that many qubits for long enough to run a meaningful calculation.
barsoap@lemm.ee 6 months ago
Argh, it’s been a while. The question is whether an n-qubit system actually has 2^n^ distinguishable states for arbitrary values of n: such a system might work up to a certain number of qubits, but then lose coherence once you try to exceed what the universe can actually compute. As far as I know we simply don’t know, because no one has yet built a system that pushes that boundary in earnest.
It would still mean ludicrously miniaturised computing, in fact computing miniaturised to the maximum possible extent, but it would not give the asymptotic speedup cryptologists have nightmares about.
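For a sense of scale of that 2^n^ question, a back-of-the-envelope Python sketch (assuming nothing beyond the standard 16 bytes per complex128 amplitude): a full classical description of an n-qubit register has to track 2^n^ complex amplitudes, and the open question is whether nature actually keeps all of them coherent once n gets large.

```python
# Back-of-the-envelope sketch: a full classical state vector for n qubits
# holds 2^n complex amplitudes (16 bytes each as complex128). The question
# above is whether physics really keeps all of them coherent for large n.
def state_vector_size(n_qubits):
    amplitudes = 2 ** n_qubits
    return amplitudes, amplitudes * 16  # bytes, assuming complex128

for n in (10, 30, 50):
    amps, size = state_vector_size(n)
    print(f"{n} qubits: {amps} amplitudes, ~{size / 1e9:.3g} GB")
# 10 qubits: 1024 amplitudes, ~1.64e-05 GB
# 30 qubits: 1073741824 amplitudes, ~17.2 GB
# 50 qubits: 1125899906842624 amplitudes, ~1.8e+07 GB
```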