Ultimately, the arguments over quantum mechanics have much bigger stakes: what reality is. The basic problem is that the theory tells us what we can expect to observe if we make measurements of a quantum system such as an atom or an electron. It doesn’t tell us how the world is, only what we’ll see if we look. Quantum uncertainty, the physicist and philosopher Jeffrey Bub of the University of Maryland told me, “doesn’t simply represent ignorance about what is the case, but a new sort of ignorance about something that doesn’t yet have a truth value, something that simply isn’t one way or the other before we measure.”
No, we had this link before. Unless that guy can explain what happens in a Bell experiment whose endpoints are in separate galaxies, there’s still a mystery.
pcalau12i@lemmygrad.ml 9 hours ago
There isn’t a mystery, there are just physicists being mystical for no justified reason.
There is no evidence that quantum systems exist in two states until you look. Physicists just endlessly gaslight each other into believing that a clearly statistical theory, one that describes a stochastic process and thus only gives you probability distributions for the results, somehow has nothing to do with probability at all and only “collapses” into probabilities when you look. It’s all a grand delusion.
The probabilities are always there from the get-go. If you separate the quantum state into its real and imaginary parts and then translate to polar form, you see that it contains two degrees of freedom: one is just a vector of real-valued probabilities for the current configuration of the system, and the other is a real-valued vector of relational phases between the objects in the system.
The latter evolves deterministically whereas the former evolves stochastically, and the stochastic evolution of the former can be influenced by the ontic state of the latter.
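A minimal numpy sketch of that decomposition (my own illustration, with a made-up two-level state): p⃗ = |ψ|² and φ⃗ = arg(ψ).

```python
import numpy as np

# Hypothetical two-level state, chosen purely for illustration.
psi = np.array([np.sqrt(0.8), np.sqrt(0.2) * np.exp(1j * np.pi / 3)])

p = np.abs(psi) ** 2     # the probability degree of freedom, p = |psi|^2
phi = np.angle(psi)      # the phase degree of freedom, phi = arg(psi)

print(p)     # [0.8 0.2] -- sums to 1, a genuine probability distribution
print(phi)   # [0. 1.047...] -- the relational phase information
```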
The update rule for classical probabilistic information is:

p⃗′ = Γp⃗

where p⃗ is your probability distribution and Γ is a stochastic matrix.
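For concreteness, here is that classical update as a couple of lines of numpy (the matrix values are made up):

```python
import numpy as np

# Made-up stochastic matrix: each column sums to 1, so Gamma @ p is again
# a probability distribution (an ordinary Markov-chain update).
Gamma = np.array([[0.9, 0.2],
                  [0.1, 0.8]])
p = np.array([0.5, 0.5])

p_next = Gamma @ p
print(p_next)          # [0.55 0.45]
print(p_next.sum())    # 1.0 -- probability is conserved
```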
The update rule for a quantum computer can literally just be expressed as:

p⃗′ = Γp⃗ + c⃗

where the additional c⃗ is a coherence term derived from a function of φ⃗, and φ⃗ is the deterministically evolving vector of relational phases. Quantum computers are not magic, they are just bits that evolve stochastically according to a modified stochastic rule, one that deviates from a classical stochastic process by the non-linear coherence term and its dependence upon the deterministic evolution of φ⃗, requiring you to keep track of both φ⃗ and p⃗.
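Here is a sketch of that split for a single unitary step. This is my own construction rather than anything official, using the Hadamard gate as an arbitrary example: Γᵢⱼ = |Uᵢⱼ|² is a stochastic matrix, and c⃗ collects the phase-dependent interference cross-terms.

```python
import numpy as np

# One step of unitary evolution, split into a classical stochastic part
# (Gamma @ p) plus a phase-dependent coherence term c.
U = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard, as an example
psi = np.array([np.sqrt(0.8), np.sqrt(0.2) * np.exp(1j * np.pi / 3)])

p = np.abs(psi) ** 2
Gamma = np.abs(U) ** 2        # |U_ij|^2 is a (doubly) stochastic matrix

# M[i,j,k] = U_ij * conj(U_ik) * psi_j * conj(psi_k); the j == k diagonal
# reproduces Gamma @ p, and the j != k cross-terms are the coherence term.
M = np.einsum('ij,ik,j,k->ijk', U, U.conj(), psi, psi.conj())
c = np.real(M.sum(axis=(1, 2)) - np.einsum('ijj->i', M))

p_next = Gamma @ p + c
assert np.allclose(p_next, np.abs(U @ psi) ** 2)  # matches the exact update
print(Gamma @ p, c, p_next)
```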
There is no physical “collapse” of anything when you make a measurement. Since p⃗ is a probability distribution, you can perform a Bayesian knowledge update on it when you make a measurement, using Bayes’ theorem. Nothing is mysterious about that. That’s literally all it is. The quantum state ψ is complex-valued, meaning it represents two degrees of freedom in the system: one of those degrees of freedom is p⃗ = |ψ|², and the other is arg(ψ) = φ⃗. When you perform a measurement, you ONLY have to update the degree of freedom associated with p⃗. You don’t have to touch the other, and this fact is guaranteed by U(1) gauge symmetry.
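A sketch of that Bayesian update (the joint distribution and the readout here are made up for illustration):

```python
import numpy as np

# Made-up joint distribution over two bits: |00>, |01>, |10>, |11>.
p = np.array([0.5, 0.1, 0.1, 0.3])

# Suppose a measurement of the first bit reads out 1.
likelihood = np.array([0.0, 0.0, 1.0, 1.0])   # P(readout=1 | configuration)

# Bayes' theorem: posterior proportional to likelihood * prior.
p = likelihood * p
p /= p.sum()
print(p)   # [0. 0. 0.25 0.75] -- an ordinary knowledge update, no collapse
```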
There is only a mystery if you delude yourself into believing that quantum mechanics is not just a non-classical probabilistic theory. If you just accept it from the get-go, then the only difficult question is how classical stochastic dynamics arises from quantum stochastic dynamics on macroscopic scales, but this is already solved via decoherence.
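To illustrate with a toy sketch of my own (not a full decoherence model): average over random environmental phase kicks, and the coherence term washes out, leaving only the classical update Γp⃗.

```python
import numpy as np

rng = np.random.default_rng(0)
U = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
psi = np.array([np.sqrt(0.8), np.sqrt(0.2)])
p, Gamma = np.abs(psi) ** 2, np.abs(U) ** 2

# Dephasing toy model: scramble the relational phases with random kicks,
# then average the updated probabilities over many runs.
samples = [np.abs(U @ (psi * np.exp(1j * rng.uniform(0, 2 * np.pi, 2)))) ** 2
           for _ in range(100_000)]

print(np.mean(samples, axis=0))   # converges to ...
print(Gamma @ p)                  # ... the classical stochastic update
```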
But if you delude yourself into believing that quantum mechanics is not a stochastic theory at all, when it clearly is, then decoherence doesn’t suffice, because you would make the mistake of interpreting the entirety of ψ, both of its degrees of freedom, as physical, meaning you would be interpreting the probability distribution p⃗ as physical. If you interpret a probability distribution as physical, then you end up interpreting the branching paths in the probability tree as physically branching paths, even though that’s clearly not what we observe, as we only ever observe a single outcome, and decoherence does not get you to a single outcome, only a classical distribution of outcomes.
This transformation of p⃗ into a physical object is then “resolved” either by proposing that p⃗ physically “collapses” down into a definite outcome when you look, with the outcome chosen according to the values of p⃗, or by claiming that the observer themselves physically branches as well into a multiverse. You end up with two absurdities because p⃗ is not a physical object. It’s a probability distribution over the system’s configuration.
The mass delusion that quantum mechanics has nothing to do with statistics or probability theory has part of its origins in Bell’s theorem, where Bell proved that there cannot be an ontic state of the system, when you’re not looking at it, that is compatible with special relativity; only the measurement readouts themselves, obtained by marginalizing out everything you aren’t looking at, are compatible with special relativity.
This then led physicists to argue for dropping everything from the model that is not the measurement readouts, so the only ontic states are the measurement readouts and ψ. They thus have to claim that ψ is the physical state of the system when you are not looking, or else their theory no longer has any objective reality in it at all.
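For anyone who wants to see what Bell’s theorem actually constrains numerically, here is the textbook CHSH computation for a singlet state in numpy. Nothing in it is specific to this argument; it is just the standard quantum prediction exceeding the bound of 2 that any local ontic-state model satisfying Bell’s assumptions must obey:

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

def obs(theta):
    # spin measurement along angle theta in the X-Z plane (outcomes +/-1)
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet state (|01> - |10>) / sqrt(2).
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    # joint expectation value of the two +/-1 readouts
    return np.real(singlet.conj() @ np.kron(obs(a), obs(b)) @ singlet)

a1, a2, b1, b2 = 0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # 2.828... = 2*sqrt(2), above the local-ontic bound of 2
```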
But this argument fails for a simple reason. When special relativity was first introduced by Einstein in 1905, it was mathematically equivalent to a theory without relativity proposed by Hendrik Lorentz in 1904. Hence, we know for a fact that a theory does not need to be relativistic to make all the same empirical predictions as relativity. Thus, you can get around Bell’s theorem in a similar way and build a model with ontic states that makes all the same predictions as relativistic quantum mechanics, such as Hrvoje Nikolic’s model.
However, I don’t actually advocate for such models; the point is that their existence shows a universe where particles have ontic states when you’re not looking at them is perfectly compatible with a universe that is relativistic when you marginalize out those ontic states and only look at measurement readouts. If the dynamics are stochastic, then this also explains why we do not include the ontic states in the model: not because they don’t exist, but because the stochastic dynamics prevents you from tracking them in the model.
We’ve known this for ages, but people are obsessed with using quantum mechanics as a springboard for their own mysticism, and so they want to pretend it is “mysterious” to justify their beliefs in multiverses, some special role for “consciousness,” or what-have-you. If you just accept the bloody obvious reality that the theory gives you a statistical distribution because it is a statistical theory, at least in part (φ⃗ evolves deterministically), then decoherence is the end of the story. A mystery only seems to remain after you adopt decoherence if you rejected that the theory was statistical to begin with.