Contra double crux

Summary: CFAR proposes double crux as a method to resolve disagreement: instead of arguing over some belief B, one should look for a crux (C) which underlies it, such that if either party changed their mind over C, they would change their mind about B.

I don’t think double crux is that helpful, principally because ‘double cruxes’ are rare in topics where reasonable people differ (and they can be asymmetric, concern a consideration’s strength rather than its direction, and so on). I suggest this may diagnose the difficulty others have noted in getting double crux to ‘work’. Good philosophers seem to do much better than double cruxing using different approaches.

I aver the strengths of double crux are primarily other epistemic virtues which are pre-requisites for double crux, yet get conflated with double cruxing itself (e.g. it is good to have a collaborative rather than combative mindset when disagreeing). Conditional on having this pre-requisite set of epistemic virtues, double cruxing does not add further benefit, and is probably inferior to other means of discussion exemplified by good philosophers. I recommend we look elsewhere.

What is a crux?

From Sabien’s exposition, a crux for some belief B is another belief C which if one changed one’s mind about C, one would change one’s mind about B. The original example was the impact of school uniforms concealing unhelpful class distinctions being a crux for whether one supports or opposes school uniforms.

A double crux is a particular case where two people disagree over B and have the same crux, albeit going in opposite directions. Say Xenia believes B (because she believes C) and Yevgeny disbelieves B (because he disbelieves C): if Xenia stopped believing C, she would stop believing B (and thus agree with Yevgeny), and vice-versa.

How common are cruxes (and double cruxes)?

I suggest the main problem facing the ‘double crux technique’ is that disagreements like Xenia’s and Yevgeny’s, which can be eventually traced to a single underlying consideration, are the exception rather than the rule. Across most reasonable people on most recondite topics, ‘cruxes’ are rare, and ‘double cruxes’ (roughly) exponentially rarer.

For many recondite topics I think about, my credence in them arises from the balance of a variety of considerations pointing in either direction. Thus whether or not I believe ‘MIRI is doing good work’, ‘God exists’, or ‘The top marginal tax rate in the UK should be higher than its current value’ does not rely on a single consideration or argument; rather, its support is distributed over a plethora of issues. Although in some cases undercutting what I take to be the most important consideration would push my degree of belief over or under 0.5, in other cases it would not.
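To make this concrete, here is a toy sketch of my own (not anything from CFAR’s materials), treating a belief as supported by several independent considerations combined in log-odds space. The numbers are invented purely for illustration; the point is that removing even the single strongest consideration need not drag credence below 0.5:

```python
from math import exp

def credence(evidence_logodds, prior_logodds=0.0):
    """Credence after combining independent considerations in log-odds space."""
    total = prior_logodds + sum(evidence_logodds)
    return 1 / (1 + exp(-total))

# B is supported by four considerations of varying strength (log-odds units).
support = [2.0, 1.0, 0.8, 0.5]
print(round(credence(support), 2))       # 0.99: confident belief in B

# Undercut the single strongest consideration: credence falls, but stays
# above 0.5 -- so that consideration, though important, was not a crux.
print(round(credence(support[1:]), 2))   # 0.91

# With weaker residual support, the same move flips the belief,
# and the strongest consideration *would* be a crux.
print(round(credence([2.0, 0.3, -0.5]), 2))  # 0.86
print(round(credence([0.3, -0.5]), 2))       # 0.45
```

On this picture, whether a consideration is a crux depends not just on its own strength but on how much residual support the belief enjoys, which is why cruxes are rare when support is distributed over many issues.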

Thus if I meet someone else who disagrees with me on (say) whether God exists, it would be remarkable if our disagreement hinged on (for example) the evidential argument from evil, such that if I could persuade them of its soundness they would renounce their faith, and vice versa. Were I persuaded the evidential argument from evil ‘didn’t work’, I expect I would remain fairly sceptical of god’s existence; were I to persuade them it ‘does work’, I would not be surprised if they maintained other evidence nonetheless makes god’s existence likely on the total balance of evidence. And so on and so forth for other issues where reasonable people disagree. I suspect a common example would be reasonably close agreement on common information, yet beliefs diverging based on ‘priors’ composed of a melange of experiences, gestalts, intuitions, and other pieces of more ‘private’ evidence.

Auxiliary challenges to double crux

I believe there are other difficulties with double crux, somewhat related to the above:


As implied above, even in cases where there is a crux C for person X believing B, C may not be a crux for B for person Y, whose crux might be something else (A, say). (Or, more generally, X’s and Y’s sets of cruxes are disjoint.) A worked example:

Carl Shulman and I disagree about whether MIRI is doing good research (we have money riding on it). I expect if I lose the bet, I’d change my mind substantially about the quality of MIRI’s work (i.e. my view would be favourable rather than unfavourable). I don’t see why this should be symmetrical between Shulman and me. If he lost the bet, he may still have a generally favourable view of MIRI, and a ‘crux’ for him may be some other piece or collection of evidence.

X and Y may simply differ in the resilience of their credence in B, such that one or the other’s belief shifts more on being persuaded of a particular consideration. One common scenario (intra-EA): if one is trying to chase ‘hits’, one’s belief is probably more resilient to subsequent adverse information than to the initial steers that suggested a given thing could be a hit.

A related issue is when one person believes they are in receipt of a decisive consideration for or against B their interlocutor is unaware of.

‘Changing one’s mind’ around p=0.5 isn’t (that) important

In most practical cases, a difference between 1 and 0.51, or between 0 and 0.49, is much more important than one between 0.49 and 0.51. Thus disagreements over the confidence of dis/belief can be more important, even if resolving them would not count as ‘changing one’s mind’: I probably differ more with a ‘convinced Atheist’ than with a ‘doubter who leans slightly towards Theism’.

Many arguments and considerations are abductive, and so lend strength to a particular belief. Thus a similar challenge applies to proposed cruxes—they may concern the strength, rather than the direction, of a given consideration. One could imagine the ‘crux’ between me and the hypothetical convinced Atheist is that they think the evidential problem of evil provides overwhelming disconfirmation of Theism, whilst I think it’s persuasive, perhaps decisive, but not such that it drives reasonable credence in Theism down to near-zero.

Sabien’s exposition recognises this, and so suggests one can ‘double crux’ over varying credences. So in this sample disagreement, the belief is ‘Atheism is almost certain’, and the crux is ‘the evidential argument from evil is overwhelming’. Yet our language for credences is fuzzy, and so what would count as a crux for the difference between (say) ‘somewhat confident’ and ‘almost certain’ is hard to nail down in a satisfactory inter-subjective way. An alternative, where any change of raw credence counts as ‘changing one’s mind’, entails that all considerations we take to support our credence in a given belief are cruxes.


I suggest these difficulties may make a good diagnosis for why double cruxing has not always worked well. Anecdata seems to vary from those who have found it helpful to those who have seen no benefit (perhaps leaning towards the latter), alongside remarks along the lines of wanting to see a public example of it working.

Raemon’s subsequent exegesis helpfully distinguishes between the actual double crux technique and “the overall pattern of behaviour surrounding this Official Double Crux technique”. They also offer a long list of considerations around the latter which may be pre-requisites for double cruxing working well (e.g. Social Skills, Actually Changing your Mind, and so on).

I wonder what value double crux really adds, if Raemon’s argument is on the right track. If double cruxing requires many (or most) of the pre-requisites suggested, then, conditional on meeting these pre-requisites, disagreements will go about as well whether one uses double crux or some other intuitive means of subsequent discussion.

A related concern of mine is a ‘castle-and-keep’-esque defence of double crux which arises from equivocating between double crux per se and a host of admirable epistemic norms it may rely upon. Thus, when defended, double crux may transmogrify from “look for some C which, if you changed your mind about it, would change your mind about B too” into a large set of incontrovertibly good epistemic practices: “It is better to be collaborative rather than combative in discussion, and to be willing to change one’s mind (etc.)”. Yet even if double cruxing is associated with (or requires) these good practices, it is not a necessary condition for them.

Good philosophers already disagree better than double cruxing

To find fault, easy; to do better, difficult

- Plutarch (paraphrased)

Per Plutarch’s remark, any shortcomings in double crux may count for little if it is the ‘best we’ve got’. However, I believe I can not only offer a better approach, but that this approach already exists ‘in the wild’. I have the fortune of knowing many extraordinarily able philosophers, and not only observe their discussions but (as they also have extraordinary reserves of generosity and forbearance) participate in them as well. Their approach seems to do much better than reports of what double cruxing accomplishes.

What roughly happens is something like this:

  1. X and Y realize their credences on some belief B vary considerably.

  2. X and Y both offer what appear (to their lights) the strongest considerations that push them to a higher/​lower credence on B.

  3. X and Y attempt to prioritize these considerations by how sensitive their credence in B is to each of them, via some mix of resilience, degree of disagreement over these considerations, and so forth.

  4. They then discuss these in order of priority, moving on when the likely yield of the current topic drops below that of the next candidate, given some underlying constraint on time.
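The prioritisation in steps 3 and 4 can be sketched as ordering an agenda by expected yield. The scoring rule, the considerations, and all the numbers below are hypothetical illustrations of mine, not anything the philosophers in question explicitly compute:

```python
# Each entry: (consideration, how much credence in B would plausibly shift if
# it were resolved, how much the parties actually disagree about it).
# All values are invented for illustration.
considerations = [
    ("evidential argument from evil", 0.15, 0.6),
    ("fine-tuning", 0.10, 0.4),
    ("moral argument", 0.03, 0.7),
]

def expected_yield(shift, disagreement):
    # A consideration is worth discussing insofar as credence in B is
    # sensitive to it *and* the parties genuinely differ over it.
    return shift * disagreement

# Step 3: order the agenda by expected yield, highest first.
agenda = sorted(considerations,
                key=lambda c: expected_yield(c[1], c[2]), reverse=True)

# Step 4: work down the agenda until the time budget runs out.
time_budget = 2
for name, shift, disagreement in agenda[:time_budget]:
    print(f"discuss next: {name} "
          f"(expected yield {expected_yield(shift, disagreement):.3f})")
```

Note that on this toy scoring the ‘moral argument’ is dropped despite being the most contested item, because credence in B is barely sensitive to it—which mirrors the point below about not getting bogged down in resilient fundamental disagreements.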

This approach seems to avoid the ‘in theory’ objections I raise against double crux above. It seems to avoid some of the ‘in practice’ problems people observe:

  • These discussions often occur (in fencing terms) at double time, and thus one tends not to flounder trying to find ‘double-cruxy’ issues. Atheist may engage Theist by attempting to undermine the free will defence to the argument from evil, whilst Theist may engage Atheist on the deficiencies of moral anti-realism to prepare the ground for a moral argument for the existence of god. These may be crux-y, but they may be highly asymmetrical. Atheist may be a compatibilist but grant libertarian free will for the sake of argument, for example: thus Atheist’s credence in God will change little even if persuaded the free will defence broadly ‘checks out’ given libertarian free will, and vice versa.

  • These discussions seldom get bogged down in fundamental disagreements. Although Deontologist and Utilitarian recognise their views on normative ethics are often a ‘double crux’ for many applied ethics questions (e.g. Euthanasia), they mutually recognise their overall views on normative ethics will likely be sufficiently resilient that the chance of either of them ‘changing their mind’ based on a conversation is low. Instead they turn their focus to other matters which are less resilient, where they anticipate a greater likelihood of someone or other changing their mind.

  • There appear to be more realistic expectations about the result. If Utilitarian and Deontologist do discuss the merits of utilitarianism versus (say) Kantianism, there’s little expectation of ‘resolving their disagreement’ or of finding mutually crucial considerations. Rather they pick at a particular leading consideration on either side and see whether it may change their confidence (this is broadly reflected in the philosophical literature: papers tend to concern particular arguments or considerations, rather than offering all-things-considered determinations of broad recondite philosophical positions).

  • There appear to be better stopping rules. On the numerous occasions where I’m not aware of a particularly important consideration, it often seems a better use of everyone’s time for me to read about this in the relevant literature rather than continuing to discuss (I’d guess ‘reading a book’ beats ‘discussing with someone you disagree with’ on getting more accurate beliefs about a topic per unit time surprisingly often).

Coda: Wherefore double crux?

It is perhaps possible for double crux to be expanded or altered to capture the form of discussion I point to above, and perhaps one can recast all the beneficial characteristics I suggest in double crux verbiage. Yet such a program appears a fool’s errand: the core idea of double crux introduced at the top of the post is distinct not only from generally laudable epistemic norms (cf. Intermezzo, supra), but also from the practices of the elite cognisers I point towards in the section above. A concept of double crux so altered as to incorporate these things is epiphenomenal—the engine driving the better disagreement is simply those other principles and practices double crux has now appropriated, and its chief result is to add terminological overhead and, perhaps, inapt approximation.

I generally think the rationalist community already labours under too much bloated jargon: words and phrases which are hard for outsiders to understand, and yet do not encode particularly hard or deep concepts. I’d advise against further additions to the lexicon. ‘Look for key considerations’ captures the key motivation for double crux better than ‘double crux’ itself, and its meaning is clear.

The practices of exceptional philosophers set a high bar: these are people selected for, and who practice heavily, argument and disagreement. It is almost conceivable that they are better at this than even the rationalist community, notwithstanding the vast and irrefragable evidence of this group’s excellence across so many domains. Double crux could still have pedagogical value: it might be a technique which cultivates better epistemic practices, even if those who already enjoy excellent epistemic practices have a better alternative. Yet this does not seem the original intent, nor does there appear much evidence of this benefit.

In the introduction to double crux, Sabien wrote that the core concept was ‘fairly settled’. In conclusion he writes:

We think double crux is super sweet. To the extent that you see flaws in it, we want to find them and repair them, and we’re currently betting that repairing and refining double crux is going to pay off better than try something totally different. [emphasis in original]

I respectfully disagree. I see considerable flaws in double crux, which I don’t think have much prospect of adequate repair. That time and effort would be better spent looking elsewhere.