Others say chance is a physical property – a “propensity” of systems to produce certain outcomes. But this feels suspiciously like adding a mysterious force to our physics.[4] When we look closely at physical systems (leaving quantum mechanics aside for now), they often seem deterministic: if you could flip a coin exactly the same way twice, it would land the same way both times.
Don’t sideline QM: it’s highly relevant. If there are propensities, real probabilities, then they are not mysterious; they are just the way reality works. They might be unnecessary to explain many of our practices of ordinary probabilistic reasoning, but that doesn’t make them mysterious in themselves.
If you can give a map-based account of probabilistic reasoning, that’s fine as far as it goes… but it doesn’t go as far as proving there are no propensities.
Whatever that means, it doesn’t mean that maps can never correspond to territories. In-the-map does not imply not-in-the-territory. “Can be thought about in a certain way” does not imply “has to be thought about in a certain way”.
Like latitude and longitude, chances are helpful coordinates on our mental map, not fundamental properties of reality. When we say there’s a 70% chance of rain, we’re not making claims about mysterious properties in the world.
But you could be partially making claims about the world, since propensities are logically possible… even though there is a layer of subjective, lack-of-knowledge-based uncertainty on top.
(And the fact that there is so much ambiguity between in-the-map probability and in-the-territory probability itself explains why there is so much confusion about QM.)
@Maxwell Peterson

Well, you can regard QM as deterministic, so long as you are willing to embrace nonlocality… but you don’t have to.
Although it is worth noting that many theories of quantum mechanics— in particular, Everettian and Bohmian quantum mechanics—are perfectly deterministic.
...only means you can.
The existence of real probabilities is still an open question, and still not closed by noticing that there is a version of probability/possibility/chance in the mind/map… because that doesn’t mean there isn’t also a version in the territory/reality.
Bayesianism in particular doesn’t mean probability is in the mind in a sense exclusive of being in the territory.
Consider performing a Bayesian experiment in a universe with propensities. You start off with a prior of 0.5, on indifference, that your photons will be spin up. You perform a run of experiments, and 50% of them are spin up. So your posterior is also 0.5… which is also the in-the-territory probability.
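The experiment described above can be sketched numerically. This is a minimal Beta-Bernoulli sketch, where a uniform Beta(1, 1) prior stands in for indifference; the conjugate-prior choice and the trial count are my assumptions, not the comment’s:

```python
import random

# Toy model (illustrative, not from the comment): a Bayesian experiment in
# a universe with a real propensity. The photon's "in-the-territory"
# probability of spin-up is fixed at 0.5; the agent starts from a uniform
# Beta(1, 1) prior (indifference) and updates on each measurement.
random.seed(0)

PROPENSITY = 0.5        # the objective chance, assumed to exist
alpha, beta = 1.0, 1.0  # Beta(1, 1) = uniform prior, mean 0.5

for _ in range(10_000):
    spin_up = random.random() < PROPENSITY
    if spin_up:
        alpha += 1
    else:
        beta += 1

posterior_mean = alpha / (alpha + beta)
print(f"posterior mean credence in spin-up: {posterior_mean:.3f}")
```

The posterior mean settles near 0.5, so the agent’s in-the-map credence ends up tracking the in-the-territory propensity, just as the comment describes.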
@Cubefox

Credences need to be about something, but they don’t need to be about propensities. A Bayesian can prove that they have the right credences by winning bets, which is quite possible in a deterministic universe.

Agreed.
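The claim about winning bets in a deterministic universe can be sketched with a toy simulation. The deterministic “coin” below (a thresholded logistic map), the log scoring rule, and all numbers are my illustrative choices, not anything from the comment:

```python
import math

# A fully deterministic sequence of "coin flips": iterate the logistic map
# and threshold at 0.5. Nothing in the territory is random here, yet an
# agent whose credence matches the long-run frequency outscores (wins bets
# against) an agent with a poorly chosen credence.
x = 0.123
outcomes = []
for _ in range(10_000):
    x = 3.9 * x * (1 - x)     # deterministic logistic map
    outcomes.append(x > 0.5)  # a fully determined "flip"

freq = sum(outcomes) / len(outcomes)  # long-run frequency of "heads"

def avg_log_score(credence, outcomes):
    # Average log score of betting at a fixed credence; higher is better.
    return sum(math.log(credence if o else 1 - credence)
               for o in outcomes) / len(outcomes)

# The calibrated agent strictly beats the miscalibrated one (Gibbs'
# inequality: the average log score is maximized at credence == freq).
print(avg_log_score(freq, outcomes) > avg_log_score(0.9, outcomes))  # True
```

The point is just that betting success measures calibration to the realized frequencies, and those frequencies exist whether or not the underlying process involves any in-the-territory chance.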
If the authors claim that adding randomness to the territory in classical mechanics requires making it more complex, they should also notice that removing the probability from the territory in quantum mechanics (as in Bohmian mechanics) tends to make the theories more complex.
Also, QM is not a weird edge case to be discarded at leisure; it is, to the best of our knowledge, a fundamental aspect of what we call reality. Sidelining it is like arguing “any substance can be divided into arbitrarily small portions”—sure, as far as everyday objects such as a bottle of water are concerned, this is true to some very good approximation, but it will not convince anyone.
Also, I am not sure that, for the many-worlds interpretation, the probability of observing spin-up when looking at a mixed state is something which firmly lies in the map. From what I can tell, what happens in MWI is that the observer becomes entangled with the mixed state. From the point of view of the observer, they find themselves either in the state where they observed spin-up or in the state where they observed spin-down, but describing their world model before observation as “I will find myself either in spin-up-world or spin-down-world, and my uncertainty about which of these it will be is subjective” seems to grossly misrepresent that model. They would say “One copy of myself will find itself in spin-down-world, and one in spin-up-world, and if I were to repeat this experiment to establish a frequentist probability, I would find that the probability of each outcome is given by the squared magnitude of the coefficient of that part of the wave function”.
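The frequentist check that the hypothetical observer describes can be illustrated with a toy Monte Carlo run. The amplitudes 0.6 and 0.8 are arbitrary example values (chosen so the squared magnitudes sum to 1), not anything from the comment:

```python
import random

# Toy illustration of the Born rule as a frequency claim: for a state
# a|up> + b|down>, repeated measurement should show each outcome with
# frequency |amplitude|^2. The amplitudes below are arbitrary examples.
random.seed(1)

amp_up, amp_down = 0.6, 0.8  # real amplitudes; 0.36 + 0.64 = 1
p_up = amp_up ** 2           # Born rule: probability = |amplitude|^2

trials = 100_000
ups = sum(random.random() < p_up for _ in range(trials))
print(f"observed spin-up frequency: {ups / trials:.3f} "
      f"(Born rule predicts {p_up})")
```

Of course, simulating the outcome with a pseudo-random generator begs the interpretational question; the snippet only shows what the observer’s predicted long-run frequencies look like, not where the probability “lives”.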
So, in my opinion:

- If a blackjack player wonders whether a card placed face-down on the table is an ace, that is uncertainty in their map.
- If someone wonders how a deterministic but chaotic physical system will evolve over time, that is also uncertainty in the map.
- If someone wonders what outcome they are likely to measure in QM, that is (without adding extra epicycles) uncertainty in the territory.
- If someone wonders how a large statistical ensemble influenced by QM at the microscopic level (such as a real gas) will evolve, that might be mostly uncertainty in the map at very short time scales (where position and momentum uncertainty and the statistical nature of scattering cross-sections would not constrain Laplace’s demon too much), but it lies in the territory over most useful time spans.
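The “deterministic but chaotic” case can be made concrete with the logistic map, a standard textbook example; the parameter 3.9 and the 1e-6 initial error are my illustrative choices:

```python
# A fully deterministic rule, yet a tiny error in the observer's knowledge
# of the initial condition (map-side uncertainty, nothing random in the
# territory) soon makes prediction hopeless.
x, y = 0.400000, 0.400001  # territory vs. the observer's slightly-off map
steps = 0
while abs(x - y) < 0.1 and steps < 1000:
    x = 3.9 * x * (1 - x)  # exact deterministic evolution
    y = 3.9 * y * (1 - y)  # same rule, perturbed starting point
    steps += 1
print(f"trajectories differ by more than 0.1 after {steps} steps")
```

An initial uncertainty of one part in a million blows up to macroscopic size within a few dozen iterations, which is exactly why chaotic determinism still leaves the observer with in-the-map uncertainty.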
I agree with both of you—QM is one of our most successful physical theories, and we should absolutely take it seriously! We de-emphasized QM in the post so we could focus on the de Finetti perspective, and what it teaches us about chance in many contexts. QM is also very much worth discussing—it would just be a longer, different, more nuanced post.
It is certainly true that certain theories of QM—such as the GRW one mentioned in footnote 8 of the post—do have chance as a fundamental part of the theory. Insofar as we assign positive probability to such theories, we should not rule out chance as being part of the world in a fundamental way.
Indeed, we tried to point out in the post that the de Finetti theorem doesn’t rule out chances, it just shows we don’t need them in order to apply our standard statistical reasoning. In many contexts—such as the first two bullet points in the comment to which I am replying—I think that the de Finetti result gives us strong evidence that we shouldn’t reify chance.
I also think—and we tried to say this in the post—that it is an open question and active debate how much this very pragmatic reduction of chance can extend to the QM context. Indeed, it might very well be that the last two bullet points above do involve chance being genuinely in the territory.
So I suspect we pretty much agree on the broad point—QM definitely gives us some evidence that chances are really out there, but there are also non-chancey candidates. We tried to mention QM and indicate that things get subtle there without it distracting from the main text.
Some remarks on the other parts of the comments are below, but they are more for fun & completeness, as they get in the weeds a bit.
***
In response to the discussion of whether adding or removing randomness makes something more complex: we didn’t make any such claim.
Complexity isn’t a super motivating property for me in thinking about fundamental physics. Though I do find the particular project of thinking about randomness in QM really interesting—here is a paper I enjoy that shows things can get pretty tricky.
I also agree that how different theories of QM interact with the constraints of special relativity (esp. locality) is very important for evaluating the theory.
With respect to the many worlds interpretation, at least Everett himself was clear that he thought his theory didn’t involve probability (though of course we don’t have to blindly agree with what he says about his version of many worlds—he could be wrong about his own theory, or we could be considering a slightly different version of many worlds). This paper of his is particularly clear about this point. At the bottom of page 18 he discusses the use of probability theory mathematically in the theory, and writes:
“Now probability theory is equivalent to measure theory mathematically, so that we can make use of it, while keeping in mind that all results should be translated back to measure theoretic language.”
Jeff Barrett, whose book I linked to in the QM footnote in the main text, and whose annotations are present in the linked document, describes the upshot of this remark (in a comment):
“The reason that Everett insists that all results be translated back to measure theoretic language is that there are, strictly speaking, no probabilities in pure wave mechanics; rather, the measure derived above provides a standard of typicality for elements in the superposition and hence for relative facts.”
In general, Everett thought “typicality” a better way to describe the norm-squared amplitude of a branch in his theory. On Everett’s view, it would not be appropriate to confuse a physical quantity (typicality) with probability (the kind of thing that guides our actions in an expected-utility way and drives our epistemology in a Bayesian way), even if they obey the same mathematics.
In general, my understanding is that in many worlds you need to add some kind of rationality principle or constraint to an agent in the theory so that you get out the Born rule probabilities, either via self-locating uncertainty (as the previous comment suggested) or via a kind of decision theoretic argument. For example, here is a paper that uses an Epistemic Separability Principle to yield the required probabilities. Here is another paper that takes the more decision theoretic approach, introducing particular axioms of rationality for the many worlds context. So while I absolutely agree that there are attractive strategies for getting probability out of many worlds, they tend to involve some rationality principles/constraints, which aren’t themselves supplied by the theory, and which make it look a bit more like the probability is in the map, in those cases. Though, of course, as an aspiring empiricist, I want my map to be very receptive to the territory. If there is some relevant structure in the territory that constrains my credences, in conjunction with some rationality principles, then that seems useful.
But a lot of these remarks are very in the weeds, and I am very open to changing my mind about any of them. It is a very subtle topic.
You did a bit more than de-emphasize it in the title!
Also:
Like latitude and longitude, chances are helpful coordinates on our mental map, not fundamental properties of reality.
“Are”?
Insofar as we assign positive probability to such theories, we should not rule out chance as being part of the world in a fundamental way. Indeed, we tried to point out in the post that the de Finetti theorem doesn’t rule out chances, it just shows we don’t need them in order to apply our standard statistical reasoning. In many contexts—such as the first two bullet points in the comment to which I am replying—I think that the de Finetti result gives us strong evidence that we shouldn’t reify chance.
The perennial source of confusion here is the assumption that the question is whether chance/probability is in the map or the territory… but the question sidelines the “both” option. If there were strong evidence of mutual exclusion, of an XOR rather than an inclusive-OR premise, the question would be appropriate. But there isn’t.
If there is no evidence of an XOR, no amount of evidence in favour of subjective probability is evidence against objective probability, and objective probability needs to be argued for (or against) on independent grounds. Since there is strong evidence for subjective probability, the choices are subjective+objective versus subjective only, not subjective versus objective.
(This goes right back to “probability is in the mind”)
Occam’s razor isn’t much help. If you assume determinism as the obvious default, objective uncertainty looks like an additional assumption… but if you assume randomness as the obvious default, then any deterministic or quasi-deterministic law seems like an additional thing.
In general, my understanding is that in many worlds you need to add some kind of rationality principle or constraint to an agent in the theory so that you get out the Born rule probabilities, either via self-locating uncertainty (as the previous comment suggested) or via a kind of decision theoretic argument.
There’s a purely mathematical argument for the Born rule. The tricky thing is explaining why observations have a classical basis—why observers who are entangled with a superposed system don’t go into superposition with themselves. There are multiple aspects to the measurement problem: the existence or otherwise of a fundamental measurement process, the justification of the Born rule, the reason for the emergence of sharp pointer states, and the reason for the appearance of a classical basis. Everett theory does rather badly on the last two.
If the authors claim that adding randomness in the territory in classical mechanics requires making it more complex, they should also notice that for quantum mechanics, removing the probability from the territory for QM (like Bohmian mechanics) tends to make the the theories more complex.
OK, but people here tend to prefer many worlds to Bohmian mechanics… it isn’t clear that MWI is more complex, but it also isn’t clear that it is actually simpler than the alternatives, as it’s stated to be in the rationalsphere.