Yudkowsky says in one of his posts that since probabilities of 0 and 1 correspond to -∞ and +∞ in log-odds, you can't simply add up the probabilities of all hypotheses and get 1. However, I don't see why this should necessarily follow. After all, to select one hypothesis from the hypothesis space, we need a number of bits of evidence corresponding to the program complexity of that hypothesis.

And accordingly we don't need an infinite number of proofs to choose among them: as many bits as there are in the longest hypothesis is sufficient, since any longer hypotheses will compete with the shorter ones not on correctness but on accuracy.

Yes, in the end you can never reach a probability of 1 because you have meta-level uncertainty, but that is exactly what meta-level probability is, and it should be written as a separate factor; otherwise multiplying in an infinite number of uncertain meta levels gives each of your hypotheses a probability of 0.

And the probability P(H), without considering meta levels, should never be 0 or 1, but the conditional probability P(H|O) could well be, since the entire meta level is packed into P(O), and therefore P(H|O) has a known, finite program complexity. That is, something like:

A="first bit is zero", B="first bit is one"

C="A or B is true", O="other a priori"

P(O)=~0.99

P(A|O)=1/2, P(B|O)=1/2

P(C|O)=P(A|O)+P(B|O)=1/2+1/2=1

P(C)=P(C|O)*P(O)=1*0.99=0.99
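The toy calculation above can be sketched in a few lines of Python; the names (`p_O`, `p_A_given_O`, etc.) are my own labels for the quantities in the example, and the 0.99 for P(O) is the assumed value from the text:

```python
from fractions import Fraction

# O = "other a priori" background assumptions; P(O) is assumed ~0.99.
p_O = Fraction(99, 100)

# Conditional on O, the first bit is either zero (A) or one (B), equiprobably.
p_A_given_O = Fraction(1, 2)
p_B_given_O = Fraction(1, 2)

# C = "A or B is true": A and B are mutually exclusive, so probabilities add.
p_C_given_O = p_A_given_O + p_B_given_O  # exactly 1

# Unconditionally, C inherits the meta-level uncertainty as a separate factor.
p_C = p_C_given_O * p_O

print(p_C_given_O)  # 1
print(p_C)          # 99/100
```

Using `Fraction` keeps the arithmetic exact, so the conditional probability really comes out as exactly 1 rather than a float close to it.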

And if we talk about the second bit, there are two more hypotheses, orthogonal to the first two; the third bit adds two more. And if we talk about the first three bits together, there is already a choice among 8 combinations of those 6 hypotheses, and it is no longer correct to ask which one of the 6 hypotheses is true: there are 6 hypotheses but 8 combinations, and not one hypothesis but at least 3 of them must be true simultaneously.
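The counting in that paragraph can be checked directly; the `bit{k}=0/1` labels below are hypothetical names for the six single-bit hypotheses:

```python
from itertools import product

# For each of the first three bits, two orthogonal hypotheses
# ("bit k is zero" / "bit k is one"): six single-bit hypotheses in total.
single_bit = [(f"bit{k}=0", f"bit{k}=1") for k in range(3)]

# A full state of the first three bits picks one hypothesis per bit,
# so there are 2*2*2 = 8 combinations, and each combination makes
# exactly 3 of the 6 hypotheses true at the same time.
combinations = list(product(*single_bit))
print(len(combinations))  # 8
print(combinations[0])    # ('bit0=0', 'bit1=0', 'bit2=0')
```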

And accordingly, for these 8 combined hypotheses we can also add up the probabilities as 8 times 1/8 and end up with 1. Or we can write it as 1/2+1/4+1/8+1/8=1, but of course this is only possible if we can count the number of bits in the hypotheses, decompose them into those bits, and determine the intersections, so as not to count the same bits more than once.
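Both decompositions can be verified numerically, reading each hypothesis as a bit-string prefix whose probability is 2^-(number of pinned-down bits); the particular prefixes below are my own illustrative choice:

```python
from fractions import Fraction

# The uniform decomposition: 8 disjoint 3-bit hypotheses, 1/8 each.
uniform = [Fraction(1, 8)] * 8
print(sum(uniform))  # 1

# The uneven decomposition 1/2 + 1/4 + 1/8 + 1/8: hypotheses that pin
# down 1, 2, 3, and 3 bits respectively. The prefixes "0", "10", "110",
# "111" cover every 3-bit string exactly once, with no overlapping bits.
prefixes = ["0", "10", "110", "111"]
probs = [Fraction(1, 2) ** len(p) for p in prefixes]
print(sum(probs))  # 1
```

The disjointness of the prefixes is what licenses the plain addition: no 3-bit string matches two of them, so no bit of evidence is counted twice.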
