That’s an excellent way of putting it, which brings a lot of clarity to my clumsy exposition! To answer your question, yes, the same essential mechanism I discussed is at work in both progressive and traditional biases—the desire that facts should provide convenient support for normative beliefs causes bias in factual beliefs, regardless of whether these normative beliefs are cherished as achievements of progress or revered as sacred tradition. However, I think there are important practical differences that merit some separate consideration.
The problem is that traditionalist vs. progressive biases don’t appear randomly. They are correlated with many other relevant human characteristics. In particular, my hypothesis is that people with formidable rational thinking skills—who, compared to other people, have much less difficulty with overcoming their biases once they’re pointed out and critically dissecting all sorts of unpleasant questions—tend to have a very good detector for biases and false beliefs of the traditionalist sort, but they find it harder to recognize and focus on those of the progressive sort.
What this means is that in practice, when exceptionally rational people see some group feeling good about their beliefs because these beliefs are a revered tradition, they’ll immediately smell likely biases and turn their critical eye on it. On the other hand, when they see people feeling good about their beliefs because they are a result of progress over past superstition and barbarism, they are in danger of assuming without justification that the necessary critical work has already been done, so everything is OK as it is. Also, in the latter sort of situation, they will relatively easily assume that the only existing controversy is between the rational progressive view and the remnants of the past superstition, although reality could be much more complex. This could even conceivably translate into support for the mainstream progressive view even if it has strayed into all sorts of biases and falsities.
So, basically, when we consider what biases and false beliefs could be hiding in things that are presently a matter of consensus, things that it just doesn’t even occur to anyone reputable to question, it seems to me that there is a greater chance of finding those that are hiding in your (3ai) category than in the rest of (3a). Thus, I would propose a heuristic that, I believe, has the potential to detect a lot of biases we are unaware of: just as you get suspicious as soon as you see people happy and content with their traditional beliefs, you should also get suspicious whenever you see a consensus that progress has been achieved on some issue, both normatively and factually, but where the factual part is not supported by strict hard-scientific evidence and there is a high degree of normative/factual entanglement.
This sounds like an interesting idea to me, and I hope it winds up in whatever fuller exposition of your ideas you end up posting.