Understanding rationality vs. ideology debates

[This was originally intended as a Facebook post, but it grew to the point where it felt way too long for that.]

For a while now I’ve been trying to wrap my head around what I’ll loosely call “rationality vs. ideology” debates. Very roughly, these are debates where, at least at first glance, one side seems to be saying that ideology is more important than rationality in some way (or that rationality is at least partly defined in ideological terms), and the other side is saying the reverse. I’m particularly interested in debates where there seem to be very thoughtful and well-meaning people on both sides. My personal intuition leans strongly towards the “rationality-first” approach, at least for many such debates, so for a long time I’ve struggled to understand where the “ideology-first” side is coming from.

The rest of this post is where I’ve gotten to so far in my attempts to understand this, although it still feels incomplete. Interested to hear what other people think of it.


If we get a bit more fine-grained about this, it seems to me like there are several different types of debates that might be going on. Unfortunately, though, it’s not always clear which type is actually involved in any given debate, and it can easily be several types at the same time.

1) Debates about epistemology

1a) Debates over the nature of truth

Some approaches to truth say that ideology, pragmatic considerations, or the like are at least partially what makes something true or false, while others strongly disagree. Proponents of ideology-based or pragmatically-based approaches might argue that all epistemic approaches are inescapably based on “ideology” of one sort or another, so explicitly tying a theory of truth to ideological or pragmatic considerations shouldn’t in itself make the theory any less plausible than supposedly evidence-based approaches. (I once wrote a short essay about this in relation to religious beliefs, but I think a lot of that essay applies to other types of beliefs as well—see there for more details.)

1b) Debates over normative belief formation

Some think that even if truth is not technically determined by ideological or pragmatic considerations, it might still be ok or even preferable in some cases to form our beliefs or psychological dispositions based on these kinds of considerations. Basically, it’s ok or even preferable in some cases to get ourselves to believe something other than the truth. (Proponents of this might point to the fact that everybody seems to do this anyway, to some degree. See for example Kevin Simler and Robin Hanson’s The Elephant in the Brain, among many other such books.)

1c) Debates over norms of inquiry

Some might think that even if we shouldn’t deliberately try to get ourselves to believe something other than the truth, we should still use ideological / pragmatic / similar considerations in considering how to inquire after the truth. For example, maybe some topics are just no-go zones where we should not inquire beyond a certain point because it might lead to bad beliefs.

Note that the types of debates above, particularly 1b and 1c, might be different on a personal vs. societal level. Maybe we think that for certain individuals it’s ok to pursue the truth wherever it may lead, but on a societal level most people should have ideology trump cold-hearted truth-seeking. There might also be a difference between different societal groups here—e.g., scientists vs. laypeople.

2) Debates about discourse norms

2a) Debates over what can / should be discussed out loud

Maybe we can agree on what the technical truth is and what a rational person should believe in private, but we might still debate whether such things should be said out loud. This might depend on the audience, for example technical journals vs. social media, or in private conversations vs. in a public talk.

This is also related to debates over paternalism: how much should we let the general public think things through for themselves vs. tell them what to think?

2b) Debates over rhetorical norms

Is it ok or even preferable to use rhetorical tricks to convince others that our ideology is right? Can we use insults or ad hominem attacks? Can we fudge the truth or even outright lie?

2c) Debates over enforcement of discourse norms

Maybe we agree that certain things shouldn’t be discussed in a certain forum or in a certain way. But what do we do if someone tries to discuss them anyway? Should we talk to them politely and ask them to stop? Should we shout them down? Cancel them? Maybe even try to jail them?

3) Object level debates

3a) Debates over ethics and values

Sometimes the debate seems to be about normative ethical theories—consequentialism vs. deontology vs. egalitarianism vs. decolonial ethics theories, etc. Sometimes the debate is more on the applied ethics level. Sometimes it might even be traceable to the meta-ethical level.

See also Moral Foundations Theory and related approaches to understanding the origins of ethical debates.

3b) Debates over facts and evidence

Sometimes upon inspection the debate turns out not to be about rationality vs. ideology after all; it’s actually a debate over what the evidence says and/or how to interpret it.

3c) Debates over context or impact

Maybe we agree that in certain contexts we should prioritize ideology over rationality in one way or another, but we debate whether this particular instance is in fact a case of one of those contexts. For example, maybe both sides think that certain unusual discourse norms apply when a group is threatened with actual violence, but they disagree over whether that is in fact the case here.

Similarly, maybe we agree that if a certain epistemological approach or discourse norm would lead to result xyz then we should go with some other approach, but we disagree over whether the epistemological approach / discourse norm will in fact likely lead to xyz. For example, perhaps we agree that if allowing people to talk publicly about a certain topic would lead to innocent people dying, then we shouldn’t allow people to talk about that topic publicly. But we can of course still debate whether letting people talk publicly about the topic is likely to lead to innocent people dying.

(Note that 3c is really a subset of 3b—debates over facts and evidence, but it feels common enough and important enough to separate out on its own.)

4) People talking past each other

4a) Norms around charitable interpretations

Even if both sides agree on epistemological approaches and norms of discourse, that doesn’t necessarily mean they will try all that hard to understand the other side’s point of view or to give charitable interpretations to their opponents’ statements. They may even agree (on an abstract theoretical level, at least) that neither side is obligated to try that hard to understand their opponents’ point of view or to charitably interpret their statements.

4b) Failures of interpretation

Truly understanding a very different point of view can be really, really hard. I am psychologically inclined towards trying to understand points of view very different from my own (hence this post!), and I’ve been trying to do so for many years, but I still often find it extremely difficult. It should be no surprise, then, that people often fail at this. It’s a rare person who can pass an Ideological Turing Test with flying colors.

4c) Merely verbal disputes

David Chalmers has a famous essay where he asks whether much of philosophy actually consists of “merely verbal disputes,” where different philosophers use subtly different definitions for the concepts they’re discussing and therefore talk past each other. This can easily apply to disputes outside of philosophy as well. See also Eliezer Yudkowsky’s “Taboo Your Words”.

(As an aside: From my admittedly limited reading on the topic, it seemed to me that a lot of the debates about whether philosophy is largely merely verbal disputes might come down to different philosophers using the term “merely verbal dispute” in subtly different ways. In other words, I suspect that they’re having a merely verbal dispute about whether much of philosophy is merely verbal disputes!)

4d) Different simulacra levels

Zvi Mowshowitz describes four “simulacra levels” that people might be using when they say something like “x is true”. Very roughly:

  • Level 1 is that when they say “x is true” they actually mean to convey their own belief that x is in fact true.

  • Level 2 is that they may or may not themselves believe that x is true, but they want to get the audience to believe that x is true.

  • Level 3 is that when they say “x is true” they don’t really care whether or not x is in fact true, and they don’t care whether their listeners come to believe x is true, but instead they are just signalling their tribal affiliation with those who typically say things like “x is true.”

  • Level 4 is that they’re not really trying to convey anything at all with the statement “x is true” and instead they’re just repeating words that they or their audience seem to like—the statement “x is true” has to do with the vibes or pattern-matching of the words “x is true” rather than any meaning that might be conveyed by those words.

As Zvi describes, many “debates” seem to come down to one side talking on one simulacra level and the other talking on a different level. Often, but not always, this is because one or both sides are misinterpreting which level(s) the other side is talking on.

4e) Other communication failures

Miscommunications can happen for lots of reasons. Maybe one side wrote up their views in a bunch of confusing or poorly written articles that the other side just didn’t fully understand. Maybe one side failed to clearly explain their views at all, or the other side didn’t have the time or patience to read through the details of their opponents’ viewpoints.

5) The cynical take

And then of course there is the view that one or both sides aren’t actually being rational even when they say they are, even when they appear to be thoughtful and well-meaning, and sometimes even when they honestly believe they’re being rational. Everybody is subject to cognitive biases, of course, but there are also all sorts of other biases—biases due to money, or prestige, or peer group pressures, or any number of other things. And sometimes people can appear to be sincere rational truth-seekers on the outside while on the inside they’re secretly and knowingly pursuing some other goal.


Here are some other related notions; I’m not sure where exactly they fit into this breakdown, if they do: