I agree with both of you—QM is one of our most successful physical theories, and we should absolutely take it seriously! We de-emphasized QM in the post so we could focus on the de Finetti perspective, and what it teaches us about chance in many contexts. QM is also very much worth discussing—it would just be a longer, different, more nuanced post.
It is certainly true that certain theories of QM—such as the GRW one mentioned in footnote 8 of the post—do have chance as a fundamental part of the theory. Insofar as we assign positive probability to such theories, we should not rule out chance as being part of the world in a fundamental way.
Indeed, we tried to point out in the post that the de Finetti theorem doesn’t rule out chances; it just shows we don’t need them in order to apply our standard statistical reasoning. In many contexts—such as the first two bullet points in the comment to which I am replying—I think that the de Finetti result gives us strong evidence that we shouldn’t reify chance.
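For concreteness (my gloss, not a quote from the post): the binary case of the de Finetti theorem says that if an infinite sequence of 0/1 random variables $X_1, X_2, \dots$ is exchangeable, then there is a unique measure $\mu$ on $[0,1]$ such that

```latex
P(X_1 = x_1, \dots, X_n = x_n)
  = \int_0^1 \theta^{k} (1-\theta)^{\,n-k} \, d\mu(\theta),
  \qquad k = \sum_{i=1}^{n} x_i .
```

So the sequence behaves exactly as if there were an unknown chance $\theta$ drawn from $\mu$, without the theorem forcing us to treat $\theta$ as a feature of the territory.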
I also think—and we tried to say this in the post—that it is an open question and active debate how much this very pragmatic reduction of chance can extend to the QM context. Indeed, it might very well be that the last two bullet points above do involve chance being genuinely in the territory.
So I suspect we pretty much agree on the broad point—QM definitely gives us some evidence that chances are really out there, but there are also non-chancey candidates. We tried to mention QM and indicate that things get subtle there without it distracting from the main text.
Some remarks on the other parts of the comments are below, but they are more for fun & completeness, as they get in the weeds a bit.
***
In response to the discussion of whether adding or removing randomness makes a theory more complex: we didn’t make any such claim in the post.
Complexity isn’t a super motivating property for me in thinking about fundamental physics. Though I do find the particular project of thinking about randomness in QM really interesting—here is a paper I enjoy that shows things can get pretty tricky.
I also agree that how different theories of QM interact with the constraints of special relativity (esp. locality) is very important for evaluating the theory.
With respect to the many worlds interpretation, at least Everett himself was clear that he thought his theory didn’t involve probability (though of course we don’t have to blindly agree with what he says about his version of many worlds—he could be wrong about his own theory, or we could be considering a slightly different version of many worlds). This paper of his is particularly clear about this point. At the bottom of page 18 he discusses the use of probability theory mathematically in the theory, and writes:
“Now probability theory is equivalent to measure theory mathematically, so that we can make use of it, while keeping in mind that all results should be translated back to measure theoretic language.”
Jeff Barrett, whose book I linked to in the QM footnote in the main text, and whose annotations are present in the linked document, describes the upshot of this remark (in a comment):
“The reason that Everett insists that all results be translated back to measure theoretic language is that there are, strictly speaking, no probabilities in pure wave mechanics; rather, the measure derived above provides a standard of typicality for elements in the superposition and hence for relative facts.”
In general, Everett thought “typicality” a better way to describe the norm-squared amplitude of a branch in his theory. On Everett’s view, it would not be appropriate to confuse a physical quantity (typicality) with probability (the kind of thing that guides our actions in an EU way and drives our epistemology in a Bayesian way), even if they obey the same mathematics.
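To spell out the “same mathematics” point (my gloss, not Everett’s notation): for a branching state

```latex
|\psi\rangle = \sum_i c_i \, |b_i\rangle ,
\qquad
\mu(b_i) = |c_i|^2 ,
\qquad
\sum_i \mu(b_i) = \sum_i |c_i|^2 = 1 ,
```

the norm-squared weights form a non-negative, additive, normalized measure over branches, so they satisfy the Kolmogorov axioms whether or not we interpret them as probabilities.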
In general, my understanding is that in many worlds you need to add some kind of rationality principle or constraint to an agent in the theory so that you get out the Born rule probabilities, either via self-locating uncertainty (as the previous comment suggested) or via a kind of decision theoretic argument. For example, here is a paper that uses an Epistemic Separability Principle to yield the required probabilities. Here is another paper that takes the more decision theoretic approach, introducing particular axioms of rationality for the many worlds context. So while I absolutely agree that there are attractive strategies for getting probability out of many worlds, they tend to involve some rationality principles/constraints, which aren’t themselves supplied by the theory, and which, in those cases, make it look a bit more like the probability is in the map. Though, of course, as an aspiring empiricist, I want my map to be very receptive to the territory. If there is some relevant structure in the territory that constrains my credences, in conjunction with some rationality principles, then that seems useful.
But a lot of these remarks are very in the weeds, and I am very open to changing my mind about any of them. It is a very subtle topic.
You did a bit more than de-emphasize it in the title!
Also:
Like latitude and longitude, chances are helpful coordinates on our mental map, not fundamental properties of reality.
“Are”?
**Insofar as we assign positive probability to such theories, we should not rule out chance as being part of the world in a fundamental way.** Indeed, we tried to point out in the post that the de Finetti theorem doesn’t rule out chances, it just shows we don’t need them in order to apply our standard statistical reasoning. In many contexts—such as the first two bullet points in the comment to which I am replying—I think that the de Finetti result gives us strong evidence that we shouldn’t reify chance.
The perennial source of confusion here is the assumption that the question is whether chance/probability is in the map or the territory… but the question sidelines the “both” option. If there were strong evidence of mutual exclusion, of an XOR rather than an IOR premise, the question would be appropriate. But there isn’t.
If there is no evidence of an XOR, no amount of evidence in favour of subjective probability is evidence against objective probability, and objective probability needs to be argued for (or against) on independent grounds. Since there is strong evidence for subjective probability, the choices are subjective+objective versus subjective only, not subjective versus objective.
(This goes right back to “probability is in the mind”)
Occam’s razor isn’t much help. If you assume determinism as the obvious default, objective uncertainty looks like an additional assumption... but if you assume randomness as the obvious default, then any deterministic or quasi-deterministic law seems like an additional thing.
In general, my understanding is that in many worlds you need to add some kind of rationality principle or constraint to an agent in the theory so that you get out the Born rule probabilities, either via self-locating uncertainty (as the previous comment suggested) or via a kind of decision theoretic argument.
There’s a purely mathematical argument for the Born rule. The tricky thing is explaining why observations have a classical basis—why observers who are entangled with a superposed system don’t go into superposition with themselves. There are multiple aspects to the measurement problem: the existence or otherwise of a fundamental measurement process, the justification of the Born rule, the reason for the emergence of sharp pointer states, and the reason for the appearance of a classical basis. Everett theory does rather badly on the last two.
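One candidate for the “purely mathematical argument” (my guess at the intended reference, not stated in the comment) is Gleason’s theorem: on a Hilbert space of dimension at least 3, any countably additive measure $\mu$ on the lattice of projections has the form

```latex
\mu(P) = \operatorname{Tr}(\rho P)
```

for some density operator $\rho$; taking $\rho = |\psi\rangle\langle\psi|$ and $P_i = |i\rangle\langle i|$ recovers the Born rule $\mu(P_i) = |\langle i|\psi\rangle|^2$.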
If the authors claim that adding randomness to the territory in classical mechanics requires making it more complex, they should also notice that for quantum mechanics, removing the probability from the territory (as in Bohmian mechanics) tends to make the theories more complex.
OK, but people here tend to prefer many worlds to Bohmian mechanics... it isn’t clear that MWI is more complex, but it also isn’t clear that it is actually simpler than the alternatives, as it’s stated to be in the rationalsphere.