That Crisis thing seems pretty useful

Since there’s been much questioning of late over “What good is advanced rationality in the real world?”, I’d like to remind everyone that it isn’t all about post-doctoral-level reductionism.

In particular, as a technique that seems like it ought to be useful in the real world, I exhibit the highly advanced, difficult, multi-component Crisis of Faith aka Reacting To The Damn Evidence aka Actually Changing Your Mind.

Scanning through this post and the list of sub-posts at the bottom (EDIT: copied to below the fold) should certainly qualify it as “extreme rationality” or “advanced rationality” or “x-rationality” or “Bayescraft” or whatever you want to distinguish from “traditional rationality as passed down from Richard Feynman”.

An actual sit-down-for-an-hour Crisis of Faith might be something you’d only use once or twice every year or two, but on important occasions. And the components are often things that you could practice day in and day out, also to positive effect.

I think this is the strongest foot that I could put forward for “real-world” uses of my essays. (Anyone care to nominate an alternative?)

Below the fold, I copy and paste the list of components from the original post, so that we have them at hand:

  • Avoiding Your Belief’s Real Weak Points—One of the first temptations in a crisis of faith is to doubt the strongest points of your belief, so that you can rehearse your good answers. You need to seek out the most painful spots, not the arguments that are most reassuring to consider.

  • The Meditation on Curiosity—Roger Zelazny once distinguished between “wanting to be an author” versus “wanting to write”, and there is likewise a distinction between wanting to have investigated and wanting to investigate. It is not enough to say “It is my duty to criticize my own beliefs”; you must be curious, and only uncertainty can create curiosity. Keeping in mind Conservation of Expected Evidence may help you Update Yourself Incrementally: For every single point that you consider, and each element of new argument and new evidence, you should not expect your beliefs to shift more (on average) in one direction than another—thus you can be truly curious each time about how it will go.

  • Cached Thoughts and Pirsig’s Original Seeing, to prevent standard thoughts from rushing in and completing the pattern.

  • The Litany of Gendlin and the Litany of Tarski: People can stand what is true, for they are already enduring it. If a belief is true you will be better off believing it, and if it is false you will be better off rejecting it. You would advise a religious person to visualize fully and deeply the world in which there is no God, and to come, without excuses, to the full understanding that if there is no God then they will be better off believing there is no God. Someone who cannot accept this on a deep emotional level will not be able to have a crisis of faith. So you should put in a sincere effort to visualize the alternative to your belief, the way that the best and highest skeptic would want you to visualize it. Think of the effort a religionist would have to put forth to imagine, without corrupting it for their own comfort, an atheist’s view of the universe.

  • Make an Extraordinary Effort, for the concept of isshokenmei, the desperate convulsive effort to be rational that it would take to surpass the level of Robert Aumann and all the great scientists throughout history who never let go of their religions.

  • The Genetic Heuristic: You should be extremely suspicious if you have many ideas suggested by a source that you now know to be untrustworthy, but by golly, it seems that all the ideas still ended up being right. (E.g., someone concedes that the Bible was written by human hands, but still clings to the idea that it contains indispensable ethical wisdom.)

  • The Importance of Saying “Oops”—it really is less painful to swallow the entire bitter pill in one terrible gulp.

  • Singlethink, the opposite of doublethink. See the thoughts you flinch away from, that appear in the corner of your mind for just a moment before you refuse to think them. If you become aware of what you are not thinking, you can think it.

  • Affective Death Spirals and Resist the Happy Death Spiral. Affective death spirals are prime generators of false beliefs that it will take a Crisis of Faith to shake loose. But since affective death spirals can also get started around real things that are genuinely nice, you don’t have to admit that your belief is a lie in order to resist the halo effect at every point—refuse false praise even of genuinely nice things. Policy debates should not appear one-sided.

  • Hold Off On Proposing Solutions until the problem has been discussed as thoroughly as possible without proposing any; make your mind hold off from knowing what its answer will be; and try for five minutes before giving up, both generally, and especially when pursuing the devil’s point of view.

  • The sequence on The Bottom Line and Rationalization, which explains why it is always wrong to selectively argue one side of a debate.

  • Positive Bias and motivated skepticism and motivated stopping, lest you selectively look for support, selectively look for counter-counterarguments, and selectively stop the argument before it gets dangerous. Missing alternatives are a special case of stopping. A special case of motivated skepticism is fake humility where you bashfully confess that no one can know something you would rather not know. Don’t selectively demand too much authority of counterarguments.

  • Beware of Semantic Stopsigns, Applause Lights, and the choice to Explain/Worship/Ignore.

  • Feel the weight of Burdensome Details; each detail a separate burden, a point of crisis.
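The Conservation of Expected Evidence point above—that you should not expect your beliefs to shift more, on average, in one direction than another—is a one-line theorem of probability. As a sketch, for any hypothesis $H$ and any binary observation $E$:

```latex
% Conservation of Expected Evidence: the prior equals the
% expectation of the posterior over the possible observations.
\mathbb{E}\big[P(H \mid E)\big]
  = P(E)\,P(H \mid E) + P(\neg E)\,P(H \mid \neg E)
  = P(H \wedge E) + P(H \wedge \neg E)
  = P(H).
```

So any expected upward update from seeing the evidence must be exactly balanced by the probability-weighted downward update from not seeing it—which is why, going into each new argument, you can be genuinely uncertain about which way it will move you.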