I didn’t say you didn’t have good rationalist techniques. I said I don’t know whether good rationalist techniques lead to practical benefits.
Most smart people already have a naive version of this technique: that when all the evidence is going against them, they need to stop and think about whether their beliefs are right or wrong. To show that Crisis of Faith is practically valuable, you need evidence that:
(1) EITHER people don’t apply the naive technique often enough in situations where it could give practical real-world benefits, and formalizing it will convince them to do it more often,
(2) OR that the specific advice you give in the Crisis of Faith post makes a full-blown Crisis of Faith more likely to return the correct answer than the naive technique.
(3) AND that once they finish Crisis of Faith, they go through with their decision.
I give (1) low probability. People don’t change their religious or political views often enough, but they’re often good at changing their practical situations. I’ve heard many people tell stories of how they stayed up all night agonizing over whether or not to break up with a girlfriend. In many cases I think the difficulty is in reaching the point where you admit “I need to seriously reconsider this.” I doubt many people reach a point where they feel uncomfortable about their position on a practical issue but don’t take any time to think it over. And the people who would be interested in rationalism are probably exactly the sort of people who currently use the naive technique most often already. I used a naive version of Crisis of Faith for a very important decision in my life before reading OB.
I give (2) high but not overwhelming probability. Yes, all of this ought to work. But I was reading up on evidence-based medicine last night in response to your comment, and one thing that struck me the most was that “ought to work” is a very suspicious phrase. Doctors possessing mountains of accurate information about liver function can say “From what we know about the liver, it would be absolutely absurd for this chemical not to cure liver disease” and then they do a study and it doesn’t help at all. With our current state of knowledge and the complexity of the subject, it’s easier to make mistakes about rationality than about the liver. Yes, my completely non-empirical gut feeling is that the specific components of Crisis of Faith should work better than just sitting and thinking, but maybe anyone unbiased enough to succeed at Crisis of Faith is unbiased enough to succeed at the naive method. Maybe Crisis of Faith creates too much pressure to reject your old belief in order to signal rationality, even if the old belief was correct.
I give (3) medium probability.
But all this is an upper bound. The question not considered is whether some specific training program will teach people to use Crisis of Faith regularly and correctly. I predict that the “training program” of reading about it on Overcoming Bias in most cases does not, but this is easy to test:
Everyone please comment below if (how many times?) you’ve actually used the full-blown formal Crisis of Faith technique in a situation where you wouldn’t have questioned something if you hadn’t read the Crisis of Faith article. Please also mention whether it was about a practical real-world matter, and whether you ended up changing your mind.
A more formal training program might do better, but I predict this would still be the most serious bottleneck.