I agree that for the examples you’re naming (e.g., demanding strong evidence/resisting social pressure), there is a failure mode that looks like you’re going too far (e.g., being excessively dogmatic/being contrarian).
However, I don’t think that this failure mode actually results from identifying the underlying principle and then taking it to the extreme, and I think that’s an important point to clarify. For example, in the first case, the principle I see is something like “demand strong evidence for strongly held beliefs” or even more generally “believe things only as strongly as evidence suggests.” I don’t think it’s obvious that this principle can be taken too far. In particular, I think the following
A famous spoof article jokes that we don’t know parachutes are reliable because we don’t have a randomised controlled trial.
is not an example of doing that. Rather, the mistake here is something like equating rationality with academic science. We don’t have a formally conducted study on the effectiveness of parachutes, and if you think that’s the only evidence that counts, you might mistrust parachutes. But, as a matter of fact, we have excellent evidence that parachutes work, and believing this evidence is perfectly rational. So you cannot arrive at a mistrust of parachutes by having high standards for evidence; you can only arrive at it by being wrong about what kind of evidence does and doesn’t count.
Again, I only mean this as a clarification, not as a counterpoint. It is still absolutely possible to go wrong in the ways you describe, and avoiding that is important.