I agree that your position is analogous to “shutting up and multiplying.” But in fact, Eliezer may have been wrong about that in general—see the Lifespan Dilemma—because people’s utility functions are likely not unbounded.
In your case, I agree with shutting up and multiplying when we have a way to calculate the probabilities. In this case, we don’t, so we can’t do it. If you had a known probability (see cousin_it’s comment on the possible trillions of variants of Islam) of one in a trillion, then I would agree with walking away after seeing 30 1s, regardless of the emotional effect of this.
But in reality, we have no such known probability. The result is that you are going to have to use some base rate: “things that people believe,” or more accurately, “strange things that people believe,” or whatever. In any case, whatever base rate you use, it will not have a probability anywhere near 10^-20 (i.e., more than 1 in 10^20 strange beliefs turns out to be true).
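To make the base-rate point concrete, here is a minimal sketch of an odds-form Bayes update. The specific numbers (a base rate of one in a million, a likelihood ratio of a thousand against) are purely illustrative assumptions, not figures from the discussion; the point is only that any remotely plausible base rate leaves the posterior many orders of magnitude above 10^-20.

```python
def posterior_prob(prior, likelihood_ratio):
    """Posterior probability from a prior probability and a
    likelihood ratio in favor of the hypothesis (odds form)."""
    prior_odds = prior / (1 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Assumed (illustrative) base rate: 1 in a million strange beliefs is true.
base_rate = 1e-6

# Even with evidence disfavoring the belief by a factor of a thousand,
# the posterior is around 1e-9 -- nowhere near 1e-20.
print(posterior_prob(base_rate, 1e-3))
```

On these assumed numbers, you would need a likelihood ratio of roughly 10^-14 against the belief before the posterior approached 10^-20, which is exactly the kind of evidence no one has calculated here.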
My real point about the fear is that your brain doesn’t work the way your probabilities do—even if you say you are that certain, your brain isn’t. And if we had calculated the probabilities, you would be justified in ignoring your brain. But in fact, since we haven’t, your brain is more right than you are in this case. It is less certain precisely because you are simply not justified in being that certain.