Occam’s razor should be on your list. Not in the “Solomonoff had the right definition of complexity” sense, but in the sense that any proper probability distribution has to integrate to 1, and so for any definition of complexity satisfying a few common sense axioms the limit of your prior probability has to go to zero as the complexity goes to infinity.
I think you’ve oversimplified the phrasing of 6 (not your fault, though; more the fault of the English language). Although your expected value for your future estimate of P(H) should be the same as your current estimate of P(H), that doesn’t imply symmetry of expected future evidence. For example, I have a very high expectation that future evidence will very slightly increase my already very strong belief that aliens are not visiting Earth; this is mostly balanced out by a very tiny expectation that future evidence will strongly decrease that belief.
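The balance described here can be checked numerically. Below is a toy sketch of the point (all numbers are my own illustrative choices, not from the comment): a very likely, slightly confirming observation and a very unlikely, strongly disconfirming one can combine so that the expected posterior exactly equals the prior.

```python
# Toy illustration of "conservation of expected evidence":
# the expected posterior equals the prior, yet the two branches
# of possible future evidence are wildly asymmetric.
# All numbers are made up for illustration.

prior = 0.999  # current P(aliens are not visiting Earth)

# Very likely branch: weak confirming evidence nudges belief up slightly.
p_weak_confirm = 0.999
posterior_if_confirm = 0.9995

# For E[posterior] = prior, the rare disconfirming branch must satisfy:
#   p_confirm * post_confirm + (1 - p_confirm) * post_disconfirm = prior
p_disconfirm = 1 - p_weak_confirm
posterior_if_disconfirm = (prior - p_weak_confirm * posterior_if_confirm) / p_disconfirm

expected_posterior = (p_weak_confirm * posterior_if_confirm
                      + p_disconfirm * posterior_if_disconfirm)

print(posterior_if_disconfirm)            # ≈ 0.4995: a drastic drop in belief
print(abs(expected_posterior - prior))    # ≈ 0: expectation matches the prior
```

So a 99.9% chance of a tiny upward update is exactly offset by a 0.1% chance of a large downward one, which is the asymmetry the comment describes.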
for any definition of complexity satisfying a few common sense axioms the limit of your prior probability has to go to zero as the complexity goes to infinity.
What are these axioms?
Although your expected value for your future estimate of P(H) should be the same as your current estimate of P(H), that doesn’t imply symmetry of expected future evidence.
Right. In general, the distribution for your posterior probability is by no means symmetric about your prior probability.
Assuming you think only in terms of discrete options, I think the only axiom you need is that for any level of complexity k there is at least one option that is that complex.

EDIT: I’m wrong, you don’t even need this.
Does this give one any reason to believe that, if two hypotheses are under consideration, the simpler one is a priori more likely? If not, it seems to me to be missing something too crucial to be called a formalization of Occam’s razor.
Right, you’d need more than that one axiom before you could really say you had a formulation of Occam’s Razor. I’m just making a more specific point: whatever formulation of complexity you come up with, so long as it satisfies the axiom above, any probability distribution over discrete outcomes must assign diminishing probability to increasingly complex hypotheses in the limit.

EDIT: Actually, even without that axiom, so long as you consider only discrete hypotheses and your definition of complexity maps each hypothesis to a positive real number, the probability mass assigned to hypotheses more complex than x must fall to zero as x goes to infinity.
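This tail-mass claim is easy to demonstrate. The sketch below (my own illustration, truncated to finitely many hypotheses for computability) assigns an arbitrary proper prior and arbitrary positive real complexities; whatever those assignments are, the mass above a complexity threshold x must shrink as x grows, simply because the total mass is 1 and each hypothesis eventually falls below the threshold.

```python
import random

# Arbitrary proper prior over N discrete hypotheses, each with an
# arbitrary positive real "complexity". The particular values don't
# matter: the probability mass on hypotheses more complex than x must
# fall to zero as x grows, because the total mass is fixed at 1.

random.seed(0)
N = 10_000
weights = [random.random() for _ in range(N)]
total = sum(weights)
prior = [w / total for w in weights]                      # sums to 1
complexity = [random.uniform(0, 100) for _ in range(N)]   # positive reals

def tail_mass(x):
    """Probability assigned to hypotheses with complexity greater than x."""
    return sum(p for p, c in zip(prior, complexity) if c > x)

for x in (0, 25, 50, 75, 100):
    print(x, tail_mass(x))   # mass above x shrinks toward 0 as x grows
```

In the infinite case the same argument goes through: the tail sums of a convergent series of probabilities vanish, so no definition of complexity can keep arbitrarily complex hypotheses from being eventually improbable.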
Not in the “Solomonoff had the right definition of complexity” sense, but in the sense that any proper probability distribution has to integrate to 1, and so for any definition of complexity satisfying a few common sense axioms the limit of your prior probability has to go to zero as the complexity goes to infinity.
Assuming you restrict to discrete probability distributions.
Modern Bayesianism is perhaps notable for showing the limitations of Occam’s razor, which was previously a widely accepted doctrine.