I think after somewhere between 30 and 300 coin flips, I would convert. With more thought and more details about what package of claims is meant by “Islam,” I could give you a better estimate. Escape routes that I’m not taking: I would start to suspect Omega was pulling my leg, I would start to suspect that I was insane, I would start to suspect that everything I knew was wrong, including the tenets of Islam. If answers like these are cop-outs—if Omega is so reliable, and I am so sane, and so on—then it doesn’t seem like much of a bullet to bite to say “yes, 2^-30 is very small but it is still larger than 2^-66; yes, something very unlikely has happened, but not as unlikely as Islam.”
Also, “there are such numbers” is very far from “we should use such numbers as probabilities when talking about claims that many people think are true.” The latter is an extremely strong claim and would therefore need extremely strong evidence before being acceptable.
If you’re expressing doubts about numbers being a good measure of beliefs, I’m totally with you! But we only need strong evidence for something to be acceptable if there are some alternatives—sometimes you’re stuck with a bad option. Somebody’s handed us a mathematical formalism for talking about probabilities, and it works pretty well. But it has a funny aspect: we can take a handful of medium-sized probabilities, multiply them together, and the result is a tiny tiny probability. Can anything be as unlikely as the formalism says 66 heads in a row is? I’m not saying you should say “yes,” but if your response is “well, whenever something that small comes up in practice, I’ll just round up,” that’s a patch that is going to spring leaks.
Another point, regarding this:

yes, 2^-30 is very small but it is still larger than 2^-66; yes something very unlikely has happened but not as unlikely as Islam.
Originally I didn’t intend to bring up Pascal’s Wager type considerations here because I thought it would just confuse the issue of the probability. But I’ve rethought this—actually this issue could help to show just how strong your beliefs are in reality.
Suppose you had said in advance that the probability of Islam was 10^-20. Then you had this experience, but the machine was shut off after 30 1's (a chance of about one in a billion). The chance that Islam is true is now one in a hundred billion, updated from your prior.
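The update being described can be checked with a quick sketch. The 10^-20 prior and the 2^-30 coincidence are the numbers from the comment above; treating the 30 1's as certain given Islam is a simplifying assumption:

```python
# Bayes' rule with the numbers from the comment above.
# Simplifying assumption: the machine is certain to show 30 1's if Islam is true.
prior = 1e-20                  # stated prior probability of Islam
p_e_given_h = 1.0              # P(30 1's | Islam), assumed for simplicity
p_e_given_not_h = 2.0 ** -30   # P(30 1's | chance), about one in a billion

# Total probability of the evidence: P(E) = P(E|H)P(H) + P(E|~H)P(~H)
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
posterior = p_e_given_h * prior / p_e

print(posterior)  # roughly 1e-11, i.e. about one in a hundred billion
```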
If this actually happened to you, and you walked away and did not convert, would you have some fear of being condemned to hell for seeing this and not converting? Even a little bit of fear? If you would, then your probability that Islam is true must be much higher than 10^-20, since we’re not afraid of things that have a one in a hundred billion chance of happening.
This is false.
I must confess that I am sometimes afraid that ghosts will jump out of the shadows and attack me at night, and I would assign a much lower chance of that happening. I have also been afraid of velociraptors. Fear is frequently irrational.
You are technically correct. My actual point was that your brain does not accept that the probability is that low. And as I stated in one of the replies, you might in some cases have reasons to say your brain is wrong… just not in this case. No one here has given any reason to think that.
It’s good you managed some sort of answer to this. However, 30–300 is quite a wide range: from 1 in 10^9 to 1 in 10^90. If you’re going to hope for any sort of calibration at all in using numbers like this, you’re going to have to be much more precise...
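The conversion behind those endpoints is just a change of base (a small sketch; 30 and 300 are the flip counts from the comment above):

```python
import math

# 2^-n expressed as a power of ten: 2^-n = 10^(-n * log10(2)), log10(2) is about 0.301
for n in (30, 300):
    exponent = n * math.log10(2)
    print(f"2^-{n} is about 1 in 10^{exponent:.0f}")  # 10^9 and 10^90
```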
I wasn’t expressing doubts about numbers being a measure of beliefs (although you could certainly question this as well), but about extreme numbers being a measure of our beliefs, which do not seem able to be that extreme. Yes, if you have a large number of independent probabilities, the result can be extreme.

And supposedly, the basis for saying that Islam (or reincarnation, or whatever) is very improbable would be the complexity of the claim. But who has really determined how much complexity it has? As I pointed out elsewhere (on the “Believable Bible” comment thread), a few statements, if we knew them to be true, would justify Islam or any other such thing. Which particular statements would we need, and how complex are those statements, really? No one has determined them to any degree of precision, and until they do, you have to use something like a base rate.

Just as astronomers start out with fairly high probabilities for the collision of near-earth asteroids, and only end up with low probabilities after very careful calculation, you would have to start out with a fairly high prior for Islam, or reincarnation, or whatever, and you would only be justified in holding an extreme probability after careful calculation… which I don’t believe you’ve done. Certainly I haven’t.
Apart from the complexity, there is also the issue of evidence. We’ve been assuming all along that there is no evidence for Islam, or reincarnation, or whatever. Certainly it’s true that there isn’t much. But that there is literally no evidence for such things simply isn’t so. The main thing is that we aren’t motivated to look at the little evidence that there is. But if you intend to assign probabilities to that degree of precision, you are going to have to take into account every speck of evidence.
I thought the salient feature of Islam was that many people believed it, not that it has less complexity than I thought, or more evidence in its favor than I thought. That might be, but I’m not interested in discussing it.
I don’t “feel” beliefs strongly or weakly. Sometimes probability calculations help me with fear and other emotions, sometimes they don’t. Again, I’m not interested in discussing it.
So tell me something about how important it is that many people believe in Islam.
I’m not interested in discussing Islam either… those points apply to anything that people believe. But that’s why it’s relevant to the question of belief: if you take something that people don’t believe, it can be arbitrarily complex, or 100% lacking in evidence (like Russell’s teapot), but things that people believe do not have these properties.
It’s not important how many people believe it. It could be just 50 people and the probability would not be much different (as long as the belief was logically consistent with the fact that just a few people believed it).
So tell me why. By “complex” do you just mean “low probability,” or some notion from information theory? How did you come to believe that people cannot believe things that are too complex?
I just realized that you may have misunderstood my original point completely. Otherwise you wouldn’t have said this: “I thought the salient feature of Islam was that many people believed it, not that it has less complexity than I thought, or more evidence in its favor than I thought.”
I only used the idea of complexity because that was komponisto’s criterion for the low probability of such claims. The basic idea is that people believe things that their priors say do not have too low a probability; but as I showed in the post on Occam’s razor, everyone’s prior is a kind of simplicity prior, even if they are not all identical (nor necessarily particularly related to information theory or whatever).
Basically, a probability is determined by the prior and by the evidence that it is updated according to. The only reason things are more probable if people believe them is that a person’s belief indicates that there is some human prior according to which the thing is not too improbable, and some evidence and way of updating that can give the thing a reasonable probability. So other people’s beliefs are evidence for us only because they stand in for the other people’s priors and evidence. So it’s not that it is “important that many people believe” apart from the factors that give it probability: the belief is just a sign that those factors are there.
Going back to the distinction you didn’t like, between a fixed probability device and a real-world claim: a fixed probability device would be a situation where the prior and the evidence are completely fixed and known. With the example I used before, let there be a lottery that has a known probability of one in a trillion. Then since the prior and the evidence are already known, the probability is still one in a trillion, even if someone says he is definitely going to win it.
In a real world claim, on the other hand, the priors are not well known, and the evidence is not well known. And if I find out that someone believes it, I immediately know that there are humanly possible priors and evidence that can lead to that belief, which makes it much more probable even for me than it would be otherwise.
If I find out that … I know that … which makes it much more probable that …
This sounds like you are updating. We have a formula for what happens when you update, and it indeed says that given evidence, something becomes more probable. You are saying that it becomes much more probable. What quantity in Bayes formula seems especially large to you, and why?
What Wei Dai said.

In other words, as I said before, the probability that people believe something shouldn’t be that much more than the probability that the thing is true.
What about the conjunction fallacy?

The probability that people will believe a long conjunction is less than the probability that they will believe one part of the conjunction (because in order to believe the conjunction, they have to believe each part). In other words, it is less probable for the same reason that the conjunction fallacy is a fallacy.
The conjunction fallacy is the assignment of a higher probability to some statement of the form A&B than to the statement A. It is well established that for certain kinds of A and B, this happens.
The fallacy in your proof that this cannot happen is that you have misstated what the conjunction fallacy is.
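Any coherent probability assignment has to order these the other way, which is exactly what makes the observed judgments a fallacy. A minimal illustration, with made-up numbers:

```python
# The outcomes satisfying A&B are a subset of those satisfying A,
# so P(A&B) <= P(A) for any events A and B. Made-up illustrative numbers:
p_a = 0.6                        # P(A)
p_b_given_a = 0.5                # P(B | A)
p_a_and_b = p_a * p_b_given_a    # P(A&B) = P(A) * P(B|A)

# Holds no matter what values in [0, 1] are chosen above.
assert p_a_and_b <= p_a
print(p_a, p_a_and_b)
```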
My point in mentioning it is that people committing the fallacy believe a logical impossibility. You can’t get much more improbable than a logical impossibility. But the conjunction fallacy experiments demonstrate that it is common to believe such things.
Therefore, the improbability of a statement does not imply the improbability of someone believing it. This refutes your contention that “the probability that people believe something shouldn’t be that much more than the probability that the thing is true.” The possible difference between the two is demonstrably larger than the range of improbabilities that people can intuitively grasp.
I wish I had thought of this.

You said it before, but you didn’t defend it.

Wei Dai did, and I defended it by referencing his position.

In that case I am misunderstanding Wei Dai’s point. He says that complexity considerations alone can’t tell you that the probability is small, because complexity appears in the numerator and the denominator. I will need to see more math (which I guess cousin_it is taking care of) before understanding and agreeing with this point. But even granting it, I don’t see how it implies that P(many believe H)/P(H) is for all H less than one billion.