What is there in my account which should tell me how these ratios behave?
You responded positively to my suggestion that we could phrase this notion of “overconfidence” as “failure to update on other people’s beliefs,” indicating that you know how to update on other people’s beliefs. At the very least, this requires some rough quantitative understanding of the players in Bayes formula, which you don’t seem to have.
If overconfidence is not “failure to update on other people’s beliefs,” then what is it?
Here’s the abbreviated version of the conversation that led us here (right?).
S: God exists with very low probability, less than one in a zillion.
U: No, you are being overconfident. After all, billions of people believe in God, you need to take that into account somehow. Surely the probability is greater than one in a billion.
S: OK, I agree that the fact that billions of people believe it constitutes evidence, but surely not evidence so strong as to get from 1-in-a-zillion to 1-in-a-billion.
Now what? Bayes theorem provides a mathematical formalism for relating evidence to probabilities, but you are saying that all four quantities in the relevant Bayes formula are too poorly understood for it to be of use. So what’s an alternative way to arrive at your one-in-a-billion figure? Or are you willing to withdraw your accusation that I’m being overconfident?
I did not say that “all four quantities in the relevant Bayes formula are too poorly understood for it to be of use.” Note that I explicitly asserted that your fourth ratio tends to infinity, and that your first one likely does as well.
If you read the linked comment thread and the Scientology example, that should make it clear why I think the evidence might well be strong enough to go from 1 in a zillion to 1 in a billion. In fact, that should even be clear from my example of the random 80-digit binary number. Suppose instead of telling you that I chose the number randomly, I said, “I may or may not have chosen this number randomly.” This would be merely raising the possibility—the possibility of something which has a prior of 2^-80. But if I then went on to say that I had indeed chosen it randomly, you would not therefore have called me a liar; you would, however, if I now chose another random 80-digit number and said that it was the same one. This shows that even raising the possibility provides almost all the evidence necessary—it brings the probability that I chose the number randomly all the way from 2^-80 up to some ordinary probability, or from “1 in a zillion” to something significantly above one in a billion.
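The size of this update can be made explicit in log-odds. The numbers below are illustrative assumptions (a 0.9 posterior standing in for “some ordinary probability”), not measured quantities:

```python
import math

# Prior odds that a specific 80-digit number was the random pick: about 2^-80.
prior = 2.0 ** -80
# Posterior once the claim is accepted at face value (illustrative figure).
posterior = 0.9

# Evidence carried by the claim, measured in bits of log-odds.
prior_bits = math.log2(prior / (1 - prior))
posterior_bits = math.log2(posterior / (1 - posterior))
evidence_bits = posterior_bits - prior_bits
print(round(evidence_bits, 1))  # roughly 83.2 bits
```

The claim itself singles out one number among 2^80, which is where nearly all of those ~83 bits come from; the speaker’s honesty only has to supply the last few.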
More is involved in the case of belief, but I need to be sure that you get this point first.
Let’s consider two situations:

For each 80-digit binary number X, let N(X) be the assertion “Unknowns picked an 80-digit number at random, and it was X.” In my ledger of probabilities, I dutifully fill in, for each of these statements N(X), 2^-80 in the P column. Now for a particular 80-digit number Y, I am told that “Unknowns claims he picked an 80-digit number at random, and it was Y”—call that statement U(Y)—and am asked for P(N(Y)|U(Y)).
My answer: pretty high, by Bayes formula. P(U(Y)|N(Y)) is pretty high because Unknowns is trustworthy, and my ledger has P(U(Y)) equal to a number on the same order as 2^-80. (Caveat: P(U(Y)) is a lot higher for highly structured things like the sequence of all 1's. But for the vast majority of Y I have P(U(Y)) = 2^-80 times something between, say, 10^-1 and 10^-6.) So P(N(Y)|U(Y)) = P(U(Y)|N(Y)) x [P(N(Y))/P(U(Y))] is a big probability times a medium-sized probability.
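A toy generative model makes the three factors concrete. The 0.9 honesty figure, and the crude assumption that a dishonest claim is also spread over ~2^80 numbers, are illustrative choices, not anyone’s stated estimates:

```python
from fractions import Fraction

honest = Fraction(9, 10)  # assumed probability Unknowns reports his pick honestly

p_N = Fraction(1, 2**80)  # ledger prior for N(Y)
p_U_given_N = honest      # P(U(Y) | N(Y)): he picked Y and truthfully says so
# P(U(Y)): the claim arises either from an honest report of a random pick,
# or (modeled crudely) from some other process also spread over ~2^80 numbers.
p_U = honest * p_N + (1 - honest) * Fraction(1, 2**80)

posterior = p_U_given_N * p_N / p_U  # P(N(Y) | U(Y)) by Bayes formula
print(posterior)  # 9/10
```

The cancellation is the whole story: P(N(Y)) and P(U(Y)) are both on the order of 2^-80, so their ratio is of ordinary size and the astronomical prior drops out of the posterior.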
What’s your answer?
Reincarnation is explained to me, and I am asked for my opinion of how likely it is. I respond with P(R), a good faith estimate based on my experience and judgement. I am then told that hundreds of millions of people believe in reincarnation—call that statement B, and assume that I was ignorant of it before—and am asked for P(R|B). Your claim is that no matter how small P(R) is, P(R|B) should be larger than some threshold t. Correct?
Some manipulation with Bayes formula shows that your claim (what I understand to be your claim) is equivalent to this inequality:
P(B) < P(R) / t
That is, I am “overconfident” if I think that the probability of someone believing in reincarnation is larger than some fixed multiple of the probability that reincarnation is actually true. Moreover, though I assume (sic) you think t is sensitive to the quantity “hundreds of millions”—e.g. that it would be smaller if it were just “hundreds”—you do not think that t is sensitive to the statement R. R could be replaced by another religious claim, or by the claim that I just flipped a coin 80 times and the sequence of heads and tails was [whatever].
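The inequality is easy to sanity-check numerically. All figures below are placeholders (this t, P(R), and P(B) are hypothetical, not anyone’s actual estimates):

```python
from fractions import Fraction

t = Fraction(1, 10**9)     # hypothetical floor claimed for P(R|B)
p_R = Fraction(1, 10**20)  # hypothetical very small prior for R

# Since P(R|B) = P(B|R) * P(R) / P(B) and P(B|R) <= 1,
# demanding P(R|B) >= t forces P(B) <= P(R) / t:
print(p_R / t)             # 1/100000000000: P(B) would have to be ~1e-11

# Conversely, with an ordinary-sized P(B) the floor fails:
p_B = Fraction(1, 100)
p_R_given_B_max = p_R / p_B  # upper bound on P(R|B), taking P(B|R) = 1
print(p_R_given_B_max < t)   # True: even the upper bound falls below t
```

In words: with these placeholder numbers, keeping P(R|B) above t would require believing that a belief in reincarnation arises with probability around 10^-11, which is the implausible commitment the inequality exposes.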
My position: I think it’s perfectly reasonable to assume that P(B) is quite a lot larger than P(R). What’s your position?
Your analysis is basically correct, i.e. I think it is overconfident to say that the probability P(B) is greater than P(R) by more than a certain factor, in particular because if you make it much greater, there is basically no way for you to be well calibrated in your opinions—because you are just as human as the people who believe those things. More on that later.
For now, I would like to see your response to the question in my comment to komponisto (i.e., how many 1's do you wait for).
I have been using “now you are saying” as shorthand for “now I understand you to be saying.” I think this may be causing confusion, and I’ll try to write more carefully.
Ugly and condescending of me, beg your pardon.
More soon.