It seems like what you’re saying is: “If we have an infinite number of hypotheses, some of them must have infinitesimally small prior probability”. That doesn’t require a proof.
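The counting fact behind that claim can be sketched numerically (the function name and the geometric prior here are my own illustration, not anything from the article): since priors sum to at most 1, no more than 1/ε hypotheses can have prior ≥ ε, so in a countably infinite hypothesis space almost all hypotheses must receive arbitrarily small prior probability.

```python
# Illustrative sketch: priors sum to 1, so at most 1/eps hypotheses
# can have prior >= eps; the rest are forced to be small.

def count_above(priors, eps):
    """Number of hypotheses with prior at least eps (bounded by 1/eps)."""
    return sum(1 for p in priors if p >= eps)

# A geometric prior p(n) = 2^-(n+1) over hypotheses n = 0, 1, 2, ...
# (truncated to 1000 terms for the demonstration).
priors = [2.0 ** -(n + 1) for n in range(1000)]

assert abs(sum(priors) - 1.0) < 1e-9     # normalized, up to truncation error
assert count_above(priors, 0.01) <= 100  # at most 1/eps = 100 can exceed eps
print(count_above(priors, 0.01))         # prints 6: only p(0)..p(5) >= 0.01
```

The same bound holds for any prior, not just this geometric one: whatever weights you choose, only finitely many hypotheses can exceed any fixed threshold.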
(I upvoted this article despite not learning much from it, since I love this kind of discussion. Why not read some papers on PAC learning, MDL, Algorithmic Information Theory, or VC theory and compare/contrast for us the view of Occam implicit in each development?)