As I see it, probability is essentially just a measure of our ignorance, or the ignorance of any model that’s used to make predictions. An event with a probability of 0.5 implies that in half of all situations where I have information indistinguishable from the information I have now, this event will occur; in the other half of all such indistinguishable situations, it won’t happen.
For example, suppose all I know is that I have a coin with two sides of equal weight that I plan to flip carelessly through the air until it lands on a flat surface. I’m not tracking how the action potentials in the neurons of my motor cortex, cerebellum, and spinal cord will affect the precise twitches of individual muscle fibers as I execute the flip, nor the precise orientation of the coin prior to the flip, nor the position of every bone and muscle in my body, nor the minute air currents that might interact differently with the textures on the heads versus tails side, nor any variations in the texture of the landing surface, nor the sniper across the street who’s secretly planning to shoot the coin once it’s in the air, and so on. Under that simplified model, where this is all I know, the coin really will land heads half the time and tails half the time across all possible instantiations of the situation in which I can’t tell any difference in the relevant initial conditions. In the reality of a deterministic universe, however, the coin (in any particular Everett branch of the multiverse) will either land heads-up or it won’t, with no in-between state that could be called “probability”.
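The “indistinguishable situations” picture can be sketched numerically. In the toy model below (entirely made up for illustration), each flip’s outcome is a fully deterministic function of a hidden microstate the observer never sees; because the observer can’t distinguish one trial from another, the frequency of heads across trials comes out near 0.5 anyway:

```python
import random

def flip(microstate: float) -> str:
    # Fully deterministic: the outcome is fixed by the hidden initial
    # conditions (launch impulse, coin orientation, air currents...),
    # here collapsed into one unobserved real number.
    return "heads" if int(microstate * 1e6) % 2 == 0 else "tails"

random.seed(0)
# From the observer's point of view, every trial is identical:
# "a fair coin flipped carelessly". Only the microstate varies.
trials = [flip(random.random()) for _ in range(100_000)]
freq = trials.count("heads") / len(trials)
print(f"heads frequency over indistinguishable trials: {freq:.3f}")
```

No randomness lives in `flip` itself; the near-0.5 frequency reflects only the observer’s ignorance of `microstate`.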
Similarly, temperature also measures our ignorance, or rather lack of control, of the trajectories of a large number of particles. There are countless microstates that produce identical macrostates, and we don’t know which microstate is currently realized: how fast and in what direction each atom is moving. We just know that the molecules in the fluid in the calorimeter are bouncing around fast enough to make the mercury atoms in the thermometer bounce against each other hard enough to expand the mercury out to the 300 K mark. But there are vigintillions of distinct ways this could be accomplished at the subatomic level, all of which are indistinguishable to us at the macroscopic level. You could shoot cold water through a large pipe at 100 mph and we would still call it cold, even though the molecules now carry extra kinetic energy from the bulk flow. That ordered component doesn’t register as temperature because we control it and can describe it with a simple model; only the disordered motion we are ignorant of counts as heat.
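How much temperature that ordered bulk motion corresponds to can be estimated directly. A rough sketch using standard physical constants (the 100 mph figure is from the example above): if the flow’s ordered kinetic energy were randomized into the three translational degrees of freedom, the temperature change per molecule would be mv²/(3k_B).

```python
K_B = 1.380649e-23                   # Boltzmann constant, J/K
M_H2O = 18.015 * 1.66053906660e-27   # mass of one water molecule, kg

v = 100 * 0.44704                    # 100 mph in m/s
bulk_ke = 0.5 * M_H2O * v**2         # ordered kinetic energy per molecule, J

# If that ordered energy were thermalized into the 3 translational
# degrees of freedom ((3/2) k_B per kelvin), the rise would be:
delta_t = bulk_ke / (1.5 * K_B)
print(f"equivalent temperature rise: {delta_t:.2f} K")
```

The result is on the order of a single kelvin, which is why the bulk motion is thermodynamically negligible here: what matters is not its size but that it is ordered and hence modelable.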
To a God-level being that actually does track the universal wave function and knows (and has the ability to control) the trajectories of every particle everywhere, there is no such thing as temperature, no such thing as probability. Particles just have whatever positions and momenta they have, and events either happen or they don’t (neglecting extra nuances from QM). For those of us bound by thermodynamics, however, these same systems of particles and events are far less predictable. We can’t see all the lowest-level details, much less model them with the same precision as reality itself, much less control them with God-level orchestration. Thus, probability, temperature, etc. become necessary tools for predicting and controlling reality at the level of rational agents embedded in the physical universe, with all the ignorance and impotence that come along with it.
As I see it, probability is essentially just a measure of our ignorance, or the ignorance of any model that’s used to make predictions. An event with a probability of 0.5 implies that in half of all situations where I have information indistinguishable from the information I have now, this event will occur; in the other half of all such indistinguishable situations, it won’t happen.
Here I think you’re mixing two different approaches. One is the Bayesian approach, which comes down to saying that probabilistic theories are normative. The question is how to reconcile that with the fact that these theories make some predictions that don’t look normative at all: for example, saying that blackbody radiation flux scales with the fourth power of temperature seems like a concrete prediction that doesn’t have much to do with the ignorance of any particular observer. QM is even more troublesome, but you don’t need to go there to begin to see some puzzles.
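The fourth-power scaling mentioned here is the Stefan–Boltzmann law, j = σT⁴. A minimal check of its observer-independent character: doubling the temperature multiplies the radiated flux by 2⁴ = 16, regardless of anyone’s state of knowledge.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def blackbody_flux(temp_k: float) -> float:
    """Radiated power per unit area of an ideal blackbody, W/m^2."""
    return SIGMA * temp_k**4

# Doubling the temperature scales the flux by 2**4 = 16 (up to
# floating-point rounding), a concrete, non-normative prediction.
ratio = blackbody_flux(600.0) / blackbody_flux(300.0)
print(ratio)
```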
The second is to say that in some circumstances you’ll get a unique probability measure on an event space by requiring that the measure be invariant under the action of some symmetry group on the space. I think this is a useful meta-principle for choosing probability measures (for example, the unitary symmetry of QM → the Born rule), and it can get you somewhere if you combine it with Dutch-book-style arguments. But in practice I assign probabilities to lots of events that don’t seem to have the kind of nice symmetry that die rolls or coin flips have, and I think what I’m doing there is a reasonable thing to do; I just don’t know how to explain what I’m doing or how to justify it properly.
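The symmetry argument can be made concrete in the die-roll case: requiring invariance under every relabeling of the six faces (the symmetric group acting on outcomes) forces the uniform measure. A small sketch, with an arbitrary made-up distribution `p`, showing that averaging over the full group projects any distribution onto the uniform one:

```python
from itertools import permutations

faces = range(6)
p = [0.05, 0.10, 0.15, 0.20, 0.25, 0.25]  # arbitrary example distribution

# Averaging a distribution over all relabelings of the faces projects
# it onto the measure invariant under the whole symmetry group.
perms = list(permutations(faces))
symmetrized = [
    sum(p[perm[i]] for perm in perms) / len(perms) for i in faces
]
print(symmetrized)  # every entry equals 1/6

# A distribution invariant under every relabeling must equal its own
# symmetrization, so the uniform measure is the unique invariant one.
```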
Similarly, temperature also measures our ignorance, or rather lack of control, of the trajectories of a large number of particles… To a God-level being that actually does track the universal wave function and knows (and has the ability to control) the trajectories of every particle everywhere, there is no such thing as temperature, no such thing as probability.
The problem here is that there are plenty of physical phenomena which are probably best understood in terms of temperature even if you’re God. Phase transitions are one example: it’s unlikely that a good understanding of the superconducting phase transition could avoid mentioning temperature and statistical mechanics entirely.
Thus, probability, temperature, etc. become necessary tools for predicting and controlling reality at the level of rational agents embedded in the physical universe, with all the ignorance and impotence that come along with it.
I agree with this in general, but we use probability in many different senses, some of them not really connected to this problem of uncertainty. I’ve given some examples already in this comment, and you can even produce ones from pure mathematics: for example, plenty of analytic number theory can be summed up as trying to understand in what sense the Liouville function is random (i.e., can be modeled as a “coin flip”) and how to prove that it is so.
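The Liouville example can be made concrete: λ(n) = (−1)^Ω(n), where Ω(n) counts prime factors with multiplicity, and “random like a coin flip” means (among other things) that its partial sums should wander on the order of √N, as a genuine ±1 random walk does. A small numerical sketch:

```python
def liouville(n: int) -> int:
    """Liouville function: (-1)**Omega(n), with Omega counting prime
    factors with multiplicity, via simple trial division."""
    count = 0
    d = 2
    while d * d <= n:
        while n % d == 0:
            n //= d
            count += 1
        d += 1
    if n > 1:
        count += 1  # remaining factor is prime
    return -1 if count % 2 else 1

N = 10_000
partial_sum = sum(liouville(n) for n in range(1, N + 1))
# A true coin-flip walk of length N typically strays on the order of
# sqrt(N), i.e. ~100 here. Proving that lambda genuinely behaves this
# way is far beyond numerics (it is closely tied to the Riemann
# hypothesis); this only illustrates what the claim would mean.
print(partial_sum)
```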
Unfortunately, I think none of this answers the question of what the “epistemic status” of a probabilistic theory actually is.