I think this claim is both key to OP’s argument and importantly wrong:
But a wavefunction is just a way to embed any quantum system into a deterministic system
(the idea being that a wavefunction is just like a probability distribution, and treating the wavefunction as real is like treating the probability distribution of some perhaps-truly-stochastic thing as real).
The wavefunction in quantum mechanics is not like the probability distribution of (say) where a dart lands when you throw it at a dartboard. (In some but not all imaginable Truly Stochastic worlds, perhaps it’s like the probability distribution of the whole state of the universe, but OP’s intuition-pumping example seems to be imagining a case where A is some small bit of the universe.)
The reason why it’s not like that is that the laws describing the evolution of the system explicitly refer to what’s in the wavefunction. We don’t have any way to understand and describe what a quantum universe does other than in terms of the evolution of the wavefunction or something basically equivalent thereto.
Which, to my mind, makes it pretty weird to say that postulating that the wavefunction is what’s real is “going further away from quantum mechanics”. Maybe one day we’ll discover some better way to think about quantum mechanics that makes that so, but for now I don’t think we have a better notion of what being Truly Quantum means than to say “it’s that thing that wavefunctions do”.
I have the impression—which may well be very unfair—that at some early stage OP imbibed the idea that what “quantum” fundamentally means is something very like “random”, so that a system that’s deterministic is ipso facto less “quantum” than a system that’s stochastic. But that seems wrong to me. We don’t presently have any way to distinguish random from deterministic versions of quantum physics; randomness or something very like it shows up in our experience of quantum phenomena, but the fact that a many-worlds interpretation is workable at all means that that doesn’t tell us much about whether randomness is essential to quantum-ness.
So I don’t buy the claim that treating the wavefunction as real is a sort of deterministicating hack that moves us further away from a Truly Quantum understanding of the universe.
(And, incidentally, if we had a model of Truly Stochastic physics in which the evolution of the system is driven by what’s inside those probability distributions—why, then, I would rather like the idea of claiming that the probability distributions are what’s real, rather than just their outcomes.)
Something you and the OP might find interesting: one of those things that is basically equivalent to a wavefunction, but represented in different mathematics, is the Wigner function. It behaves almost exactly like a classical probability distribution; for example, it integrates up to 1, and Bayes’ rule updates it when you measure stuff. However, in order for it to “do quantum physics” it needs the ability to have small negative patches. So quantum physics can be modelled as a stochastic process, if negative probabilities are allowed. (Incidentally, this is often used as a test of “quantumness”: do I need negative probabilities to model it with local stochastic stuff? If yes, then it is quantum.)
If you are interested in a sketch of the maths: take W to be a completely normal probability distribution describing what you know about some isolated, classical, 1D system, and take H to be the classical Hamiltonian (i.e. just a function for the system’s energy). Then the correct way of evolving your probability distribution is:
$$\dot{W} = H \left( \overleftarrow{\partial}_x \overrightarrow{\partial}_p - \overleftarrow{\partial}_p \overrightarrow{\partial}_x \right) W$$

where the arrows on the derivatives have the obvious effect of firing them either at H or at W. The first pair of derivatives in the bracket is Newton’s second law (the rate of change of the energy H with respect to x turns potentials into forces, and the derivative with respect to momentum acting on W then changes the momentum in proportion to the force); the second pair is the definition of momentum (position changes are proportional to momentum).
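To see the two bracket terms at work, here is a minimal numerical sketch (my own illustration; the harmonic oscillator H = p²/2 + x²/2 with unit mass and frequency is an assumed example, not something from the comment): evolve a cloud of sample points under Hamilton’s equations, which are exactly those two terms, and the whole distribution rotates rigidly around phase space.

```python
import numpy as np

# Harmonic oscillator, H = p^2/2 + x^2/2 (unit mass and frequency).
# Each sample point follows Hamilton's equations, which are the two
# bracket terms: dx/dt = dH/dp = p, and dp/dt = -dH/dx = -x.
rng = np.random.default_rng(0)
x = rng.normal(1.0, 0.1, 100_000)  # ensemble: a narrow blob at (x, p) = (1, 0)
p = rng.normal(0.0, 0.1, 100_000)

dt = 0.001
for _ in range(int(np.pi / dt)):   # evolve for t = pi: half a period
    p -= x * dt                    # kick: momentum changes in proportion to the force
    x += p * dt                    # drift: position changes in proportion to momentum

# After half a period the blob has rotated to (x, p) = (-1, 0), and its
# spread is unchanged: the flow moves probability around without squashing it.
print(round(x.mean(), 2), round(abs(p.mean()), 2))  # -1.0 0.0
print(round(x.std(), 2))                            # 0.1
```

The symplectic (kick-then-drift) update is chosen so the discrete flow preserves phase-space volume, matching what the Liouville equation says about W.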
Instead of going to operators and wavefunctions in Hilbert space, it is possible to do quantum physics by replacing the previous equation with:
$$\dot{W} = \frac{2}{\hbar}\, H \sin\!\left( \frac{\hbar}{2} \left( \overleftarrow{\partial}_x \overrightarrow{\partial}_p - \overleftarrow{\partial}_p \overrightarrow{\partial}_x \right) \right) W$$
Here sin is understood via its Taylor series, so the first term (after the factors of ℏ/2 cancel) is the same as the first term for classical physics. The higher-order terms (where the ℏ’s do not fully cancel) can result in W becoming negative in places even if it was initially all-positive, which means that W is no longer exactly like a probability distribution, but some similar-but-different animal. Just to mess with us, the negative patches never get big enough or deep enough for any measurement we can make (limited by the uncertainty principle) to have a negative probability of any observable outcome. H is still just a normal function of the energy here.

(Wikipedia is terrible for this topic. Way too much maths stuff for my taste: https://en.wikipedia.org/wiki/Moyal_bracket)
Also, the OP is largely correct when they say “destructive interference is the only issue”. However, in the language of probability distributions, dealing with that involves the negative probabilities above. And once they go negative they are not proper probabilities any more, but some new creature. This, for example, stops us from thinking of them as just our ignorance (although they certainly include our ignorance).
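To make those negative patches concrete, here is a minimal numerical sketch (my own illustration; ℏ = 1 and the harmonic oscillator’s first excited state are assumed choices, not the comment’s): the Wigner transform of this state dips below zero at the origin while still integrating up to 1 over phase space.

```python
import numpy as np

# Normalized first excited state of the harmonic oscillator (hbar = m = omega = 1):
# psi_1(x) = (4/pi)^(1/4) * x * exp(-x^2/2).
def psi(x):
    return (4 / np.pi) ** 0.25 * x * np.exp(-x**2 / 2)

# Wigner transform: W(x, p) = (1/pi) * integral of psi(x+y) psi(x-y) e^{2ipy} dy.
Y, DY = np.linspace(-8, 8, 3201, retstep=True)

def wigner(x, p):
    integrand = psi(x + Y) * psi(x - Y) * np.exp(2j * p * Y)
    return (integrand.sum() * DY).real / np.pi

# A negative patch: no genuine probability density can do this.
print(round(wigner(0.0, 0.0), 4))  # -0.3183, i.e. -1/pi

# ...and yet it still integrates up to 1 over phase space.
xs, dx = np.linspace(-5, 5, 81, retstep=True)
ps, dp = np.linspace(-5, 5, 81, retstep=True)
total = sum(wigner(xv, pv) for xv in xs for pv in ps) * dx * dp
print(round(total, 2))             # 1.0
```

The negative dip is also shallow and narrow, in line with the point above that no actual measurement ever reports a negative probability.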
I’d expect Wigner functions to be less ontologically fundamental than wavefunctions, because turning a wavefunction into a real function in this way introduces a ton of redundant parameters: the result is a function of phase space instead of configuration space. But they’re still pretty cool.
Imagine you have a machine that flicks a classical coin and then makes either one wavefunction or another based on the coin toss. Your ordinary ignorance of the coin toss, and the quantum stuff with the wavefunction can be rolled together into an object called a density matrix.
There is a one-to-one mapping between density matrices and Wigner functions, so in fact there are zero redundant parameters when using Wigner functions. In this sense they do one better than wavefunctions, where the global phase of the universe is a redundant variable. (Density matrices also don’t have a global phase.)
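The coin-flip machine above can be sketched with a single qubit (a toy illustration of mine, not the comment’s): the purity Tr(ρ²) separates a classical mixture from a genuine superposition, and the redundant global phase visibly drops out of the density matrix.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# The machine flips a fair classical coin and prepares |0> or |1>:
# rho = 1/2 |0><0| + 1/2 |1><1| rolls both kinds of uncertainty together.
rho_mix = 0.5 * np.outer(ket0, ket0.conj()) + 0.5 * np.outer(ket1, ket1.conj())

# Compare an equal superposition (|0> + |1>)/sqrt(2), which is a pure state.
plus = (ket0 + ket1) / np.sqrt(2)
rho_pure = np.outer(plus, plus.conj())

# Both have trace 1, but the purity Tr(rho^2) tells them apart:
print(round(np.trace(rho_mix @ rho_mix).real, 10))   # 0.5  (classical ignorance mixed in)
print(round(np.trace(rho_pure @ rho_pure).real, 10)) # 1.0  (pure quantum state)

# Global phase is redundant: e^{i*theta} |psi> gives the same density matrix.
shifted = np.exp(1j * 0.7) * plus
print(np.allclose(np.outer(shifted, shifted.conj()), rho_pure))  # True
```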
That is not to say there are no issues at all with assuming that Wigner functions are ontologically fundamental. For one, while Wigner functions work great for continuous variables (e.g. position, momentum), Wigner functions for discrete variables (e.g. qubits, or spin) are a mess. The normal approach can only deal with discrete systems whose dimension is a prime number (i.e. a particle with 3 possible spin states is fine, but 6 is not); if the dimension is not prime, weird extra tricks are needed.
A second issue is that the Wigner function, being equivalent to a density matrix, combines both the quantum stuff and the observer’s ignorance into one object. The observer’s ignorance ought to be left behind if we are raising the thing to being ontologically fundamental, so some change would be required.
Another issue with “ontologising” the Wigner function is that you need some kind of idea of what those negatives “really mean”. I spent some time thinking about “If the many worlds interpretation comes from ontologising the wavefunction, what comes from doing that to the Wigner function?” a few years ago. I never got anywhere.
Wouldn’t it also be many worlds, just with a richer set of worlds? Because with wavefunctions, your basis has to pick between conjugate pairs of variables, so your “worlds” can’t e.g. have both positions and momentums, whereas Wigner functions tensor the conjugate pairs together, so their worlds contain both positions and momentums in one.
In some but not all imaginable Truly Stochastic worlds, perhaps it’s like the probability distribution of the whole state of the universe, but OP’s intuition-pumping example seems to be imagining a case where A is some small bit of the universe.
Oops, I guess I missed this part when reading your comment. No, I meant for A to refer to the whole configuration of the universe.

Then it seems unfortunate that you illustrated it with a single example, in which A was a single (uniformly distributed) number between 0 and 1.

But it’s a generic type; A could be anything. I had the functional programming mindset where it was to be expected that the Distribution type would be composed into more complex distributions.
The wavefunction in quantum mechanics is not like the probability distribution of (say) where a dart lands when you throw it at a dartboard. (In some but not all imaginable Truly Stochastic worlds, perhaps it’s like the probability distribution of the whole state of the universe, but OP’s intuition-pumping example seems to be imagining a case where A is some small bit of the universe.)
The reason why it’s not like that is that the laws describing the evolution of the system explicitly refer to what’s in the wavefunction. We don’t have any way to understand and describe what a quantum universe does other than in terms of the evolution of the wavefunction or something basically equivalent thereto.
In my view, the big similarity is in the principle of superposition. The evolution of the system may in a sense depend on the wavefunction, but it is an extremely rigid sense, which requires the evolution to be invariant to chopping up a superposition into a bunch of independent pieces, or chopping up a simple state into an extremely pathological superposition.
I have the impression—which may well be very unfair—that at some early stage OP imbibed the idea that what “quantum” fundamentally means is something very like “random”, so that a system that’s deterministic is ipso facto less “quantum” than a system that’s stochastic. But that seems wrong to me. We don’t presently have any way to distinguish random from deterministic versions of quantum physics; randomness or something very like it shows up in our experience of quantum phenomena, but the fact that a many-worlds interpretation is workable at all means that that doesn’t tell us much about whether randomness is essential to quantum-ness.
It’s worth emphasizing that the OP isn’t really how I originally thought of QM. One of my earliest memories was of my dad explaining quantum collapse to me, and me reinventing decoherence by asking why it couldn’t just be that you got entangled with the thing you were observing. It’s only now, years later, that I’ve come to take issue with QM.
In my mind, there are four things that strongly distinguish QM systems from ordinary stochastic systems:
Destructive interference
Principle of least action (you could in principle have this and the next in deterministic/stochastic systems, but it doesn’t fall out of the structure of the ontology as easily without additional laws)
Preservation of information (though of course since the universe is actually quantum, this means the universe doesn’t resemble a deterministic or stochastic system at the large scale, because we have thermodynamics and neither deterministic nor stochastic systems need thermodynamics)
Pauli exclusion principle (technically you could have this in a stochastic system too, but it feels quantum-mechanical because it can be derived from fermion products being anti-symmetric, and anti-symmetry only makes sense in quantum systems)
Almost certainly this isn’t complete, since I’m mostly an autodidact (I got taught a bit by my dad, read standard rationalist intros to quantum like The Sequences and Scott Aaronson, took a mathematical physics course, coded a few qubit simulations, and binged some Wikipedia and YouTube). Of these, only destructive interference really seems like an obstacle, and only a mild one.
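Destructive interference, the one item above that poses a real obstacle, fits in a few lines (a toy two-path sketch of my own; the equal weights and π relative phase are assumptions): amplitudes can cancel, while nonnegative probabilities can only ever add.

```python
import numpy as np

# Two paths to the same detector, equal weight, relative phase pi.
amp_a = 1 / np.sqrt(2)
amp_b = -1 / np.sqrt(2)

# Quantum rule: add the amplitudes first, then square.
p_quantum = abs(amp_a + amp_b) ** 2
print(p_quantum)                 # 0.0 -- the paths cancel exactly

# Stochastic mimic: add the paths' probabilities directly.
p_classical = abs(amp_a) ** 2 + abs(amp_b) ** 2
print(round(p_classical, 10))    # 1.0 -- no cancellation is possible
```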
(And, incidentally, if we had a model of Truly Stochastic physics in which the evolution of the system is driven by what’s inside those probability distributions—why, then, I would rather like the idea of claiming that the probability distributions are what’s real, rather than just their outcomes.)
I would say this is cruxy for me, in the sense that if I didn’t believe Truly Stochastic systems were ontologically fine, then I would take similar issue with Truly Quantum systems.