Rational Repentance

Related to: Branches of Rationality, Rationality Workbook

Changing your behavior to match new evidence may take more than simply updating your beliefs and then mustering your willpower, because (a) we are in denial about how often we change our minds, (b) cognitive dissonance is tolerable in the medium term, and (c) verifying that your actions, and not just your beliefs, have changed requires extra self-monitoring, so it is easy to go on pretending that your actions already fit your new picture of reality. It might help to (1) specify a quitting point in advance, (2) demonstrate your new opinion with symbolic action, or (3) activate your emotions by reading non-rational propaganda. Additional solutions are eagerly solicited.

Disclaimer:

This post contains examples drawn from politics and current events. I do not hope to change anyone’s mind about any specific political belief; I know that Politics is the Mind-killer; I have tried to use non-inflammatory language; and I have a good-faith belief that this post contains enough actual content on rationality to justify its potentially controversial examples. Equally powerful but less controversial examples will be cheerfully substituted if anyone can bring them to my attention.

Review:

As has been amply discussed in the Sequences, a key tool for overcoming the human tendency to irrationally defend prior beliefs simply because they are comfortable is to ask what, if anything, would cause you to abandon those beliefs. For example, in the “invisible dragon in the garage” parable, it quickly becomes clear to neutral observers that no potential evidence could convince an invisible-dragon fundamentalist that the dragon is fictional. If you test for breathing noises, it turns out that the dragon is inaudible. If you test for ecological impact, it turns out that the dragon lives off of invisible hamsters, etc. Thus we say that the belief in the dragon is unfalsifiable: no observation could disconfirm the hypothesis that there is a dragon in the garage, and so the belief does not pay rent in anticipated experiences.

There is a second human bias that causes you to cache an unrealistically high summary statistic for how often you change your mind: you think you change your mind pretty often, in general, but unless you are an expert, highly practiced rationalist, odds are that you do not. As evidence, try thinking of the last time you changed your mind about something, and force yourself to specify what you believed beforehand and what you believed afterward. Personally, I haven’t changed my mind about anything that I can remember since about November 10th, 2010, and I’m sure I’ve expressed thousands of opinions since then. The odds are long.

The Problem:

There is a third human bias that causes you to tell yourself that you have successfully changed your mind when you have not really done so. The adherent of the Reformed Church of Dragon leaves the garage door open, and cheerfully admits to anyone who asks that there is probably no such thing as an invisible dragon, yet she is unaccountably cautious about actually parking her car in the garage. Thus it is worth knowing not just how to change your mind, but how to change your habits in response to new information. This is a distinct skill from simply knowing how to fight akrasia, i.e., how to muster the willpower to change your habits in general.

One example of this failure mode, recently reported by Slate.com, involves American troops in Iraq: there are at least some regions in Iraq where many people strongly prefer not to have American troops around, and yet American troops persist in residing and operating there. In one such region, according to a former American soldier who was there, the people greeted the incoming foreigners with a large, peaceful protest, politely asking the Yankees to go home. When the request was ignored, locals began attacking the Americans with snipers and roadside bombs. According to the ex-soldier, Josh Steiber, the Americans responded not by leaving the region, but by ordering troops to shoot whoever happened to be around when a bomb went off, as a sort of reprisal killing. At that point, cognitive dissonance finally kicked in for Josh, who had volunteered for the military out of a sense of idealism, and he changed his mind about whether he should be in Iraq: he stopped following orders, went home, and sought conscientious objector status.

The interesting thing is that his comrades didn’t change their minds, even after seeing his example. The same troops in the same town, confronted with the same evidence that their presence was unwelcome, all continued to blame and kill the locals. One of Josh’s commanders wound up coming around to Josh’s point of view to the extent of being able to agree to disagree and give Josh a hug, but still kept ordering people to kill the locals. One wonders: what would it take to get the commander to change not just his mind, but his actions? What evidence would someone in his position have to observe before he would stop killing Iraqis? The theory is that American military presence in Iraq is good for Iraqis because it helps them build democracy, or security, or their economy, or some combination. It’s moderately challenging to concede that the theory could be flawed. But, assuming you have the rationalist chops to admit your doubt, how do you go about changing your actions to reflect that doubt? The answer isn’t to sit at home and do nothing; there are probably wars, or at the very least nonviolent humanitarian interventions, that are worth sending people abroad for (or going yourself, if you’re not busy). But if you can’t change your behavior once you arrive on the scene, your doubt is practically worthless—we could replace you with an unthinking, unquestioning patriot and get the same results.

Another example was reported by Bill McKibben, author of Deep Economy, who says he happened to be in the organic farming region of Gorasin, Bangladesh the day an international food expert arrived to talk about genetically engineered “golden rice,” which, unlike ordinary rice, is rich in Vitamin A and can prevent certain nutritional deficiency syndromes. “The villagers listened for a few minutes, and then they started muttering. Unlike most of us in the West who worried about eating genetically modified organisms, they weren’t much concerned about ‘frankenfood.’ Instead, they instantly realized that the new rice would require fertilizer and pesticide, meaning both illness and debt. More to the point, they kept saying, they had no need of golden rice because the leafy vegetables they could now grow in their organic fields provided all the nutrition they needed. ‘When we cook the green vegetables, we are aware not to throw out the water,’ said one woman. ‘Yes,’ said another. ‘And we don’t like to eat rice only. It tastes better with green vegetables.’”

Bill doesn’t say how the story ended, but one can imagine that there are many places like Gorasin where the villagers ended up with GMOs anyway. The November/December 2010 issue of Foreign Affairs has a pretty good article (partial paywall) about how international food donors have persisted in shipping grain—sometimes right past huts full of soon-to-rot local stockpiles—when what is really needed are roads, warehouses, and loans. One could argue that the continued funding of food aid at 100 times the level of food-infrastructure aid, or the continued encouragement of miracle mono-crops in the face of local disinterest, simply reflects the financial incentives of large agricultural corporations. Considering how popular farmers are and how unpopular foreign aid is, though, there are doubtless easier ways for Monsanto and ConAgra to get their government subsidies. At least some of the political support for these initiatives has to come from well-intentioned leaders who have reason to know that their policies are counterproductive but who are unable or unwilling to change their behavior to reflect that knowledge.

It sounds stupid when people act this stubbornly on the global stage, but it is surprisingly difficult not to be stubborn. What, if anything, would convince you to stop (or start) eating animals? Not merely to admit, verbally, that it is an acceptable thing for others to do, or even the moral or prudent thing for you to do, but to actually start trying to do it? What, if anything, would convince you to stop (or start) expecting monogamy in your romantic relationships? To save (or borrow) significant amounts of money? To drop one hobby and pick up another? To move across the country?

And here’s the real sore spot: how do you know? Suppose you said that you would save $1,000 a year if the real interest rate were above 15%. Would you really? What is your reference class for predicting your own behavior? Have you made a change like that before in your life? How did the strength of evidence you thought it would take to change your behavior compare to the strength of evidence it actually took? How often do you make comparably drastic changes? How often do you try to make such changes? Which are you more likely to remember—the successful changes, or the failed and quickly aborted attempts?

Solutions:

Having just recently become explicitly aware of this problem, I’m hardly an expert on how to solve it. However, for whatever it is worth, here are some potential coping mechanisms. Additional solutions are strongly encouraged in the comments section!

1) Specify a quitting point in advance. If you know ahead of time what sort of evidence, E, would convince you that your conduct is counterproductive or strictly dominated by some other course of conduct, then switching to that other course of conduct when you observe that evidence will feel like part of your strategy. Instead of seeing yourself as adopting strategy A and then being forced to switch to strategy B because strategy A failed, you can see yourself as adopting the conditional strategy C, which calls for strategy A in circumstance ~E and for strategy B in circumstance E (a toy sketch of what writing such a strategy down might look like follows below). That way your success does not depend on your commitment to strategy A, which should help bring your commitment down toward an optimal level.

Without a pre-determined quitting point, you run the risk of making excuses for an endless series of marginal increases in the strength of the evidence required to make a change of action appropriate. Sunk costs may be an economic fallacy, but they’re a psychological reality.
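For what it’s worth, here is a minimal sketch, in Python, of what committing such a conditional strategy to writing might look like. Everything in it (the condition, the actions, the numbers) is a hypothetical placeholder rather than anything prescribed above; the only load-bearing idea is that the quitting condition and the fallback are chosen before any evidence comes in.

```python
# A toy sketch of writing down a conditional strategy before you act.
# The condition, actions, and numbers below are made-up placeholders.

from dataclasses import dataclass


@dataclass
class ConditionalStrategy:
    quitting_condition: str  # E: the evidence, spelled out in advance, that would mean A is failing
    default_action: str      # strategy A: what you do as long as E has not been observed
    fallback_action: str     # strategy B: what you switch to once E is observed

    def choose(self, quitting_condition_met: bool) -> str:
        # Switching to B when E arrives is part of strategy C, not a defeat.
        return self.fallback_action if quitting_condition_met else self.default_action


# Hypothetical example, loosely echoing the savings question from earlier.
plan = ConditionalStrategy(
    quitting_condition="my real return stays below 2% for two straight years",
    default_action="keep saving $1,000 a year",
    fallback_action="redirect that $1,000 toward paying down debt",
)

print(plan.choose(quitting_condition_met=False))  # -> keep saving $1,000 a year
print(plan.choose(quitting_condition_met=True))   # -> redirect that $1,000 toward paying down debt
```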

2) Demonstrate your new opinion with symbolic action. Have you decided to move to San Francisco, even though your parents and significant other swear they’ll never visit you there? Great! We have nice weather here; we look forward to seeing you as soon as you can land a job. Meanwhile, buy a great big map of our beautiful city and put it on your bedroom wall. The map, in and of itself, doesn’t get you a damn bit closer to moving here. It doesn’t purport to influence your incentives the way a commitment contract would. What it does do is help you internalize your conscious decision, so that the decision is more broadly endorsed by the various aspects of your psyche.

I remember at one point a religious camp counselor caught me using a glowstick on the Sabbath, and advised me to throw the glowstick away, on the theory that kindling a new light on the Sabbath violated the applicable religious laws. I asked him what good throwing away the light would do, seeing as it had already been kindled and would keep on burning its fixed supply of fuel no matter where I put it. He said that even though throwing away the light wouldn’t stop it from having been kindled (there were limits to his magical thinking, despite his religious convictions), it would highlight (har har) my agreement with the principle that kindling lights is wrong and make it easier not to do it again next time. The very sense that it is strange to throw away a lit glowstick helps put cognitive dissonance to work for changing your mind instead of against it: if you didn’t strongly believe in the importance of not kindling glowsticks, why on earth would you have thrown it away? But you did throw it away, and so you must believe, and so on. Also, not reaping the benefits of the wrongly kindled light makes kindling lights seem less rewarding, and makes it easier to resist the temptation the next time—if you know, in the moment of temptation, that even if you kindle the glowstick you might repent and not let yourself enjoy its light, you’ll factor that into your utility function and be more likely to decide that the no-longer-certain future benefit of the light isn’t worth the immediate guilt.

Anyway, this is a fairly weird example; I certainly don’t care whether people light glowsticks, on a particular tribe’s Sabbath or otherwise. I think it probably does help, though, to be a bit of a drama queen. If you buy a cake while you’re dieting, don’t just resolve not to eat it; physically throw it off the second-story balcony. If you’ve just admitted to yourself that your erstwhile political enemies actually have some pretty good points, write your favorite ex-evil candidate a letter of support or a $5 check and physically put it in the mail. As much as possible, bring your whole self into the process of changing your actions.

3) Over-correct your opinion by reading propaganda. Propaganda is dangerous when you read it in order to help you form an opinion, and a deontological evil when you publish it to hack into other people’s minds (which, depending on circumstances and your philosophy, may or may not be justified by the good consequences that you expect will follow). But when you’ve already carefully considered the rational evidence for and against a proposition, and you feel like you’ve changed your mind, and yet you’re still acting as if you hadn’t changed your mind, propaganda might be just what you need. Read an essay that forcefully argues for a position even more extreme than the one you’ve just adopted, even if the essay is full of logical cul-de-sacs. In this limited context alone, gleefully give yourself temporary permission to ignore the fact that reading the essay makes you notice that you are confused. Bask in the rightness of the essay and the guilt/shame/foolishness/low-status that people who disagree with it should feel. If you gauge the dosage correctly, the propaganda might nudge your opinion just enough to make you actually adopt the new action that you felt would adequately reflect your new beliefs, but not enough to drive you over the cliff into madness.

As an example, I recently became convinced that eating industrially raised animals while living in San Francisco before the apocalypse can’t ever be morally justified, and yet, lo and behold, I still ate turkey sandwiches at Subway 5 times a week. Obviously I could have just used some of the tactics from the Akrasia Review to make eating less factory meat a conscious goal...but I’m busy using those tools for other goals, and I think that there are probably at least some contexts in which willpower is limited, or at least a variable-sum game. So I read Peter Singer’s Animal Liberation, and blamed all the woes of the world on steak for a few hours while slowly eating a particularly foul-tasting beef stew that had been ruined by some Thai hole-in-the-wall, to reinforce the message. I’m doing a little bit better...I’m more likely to cross the street and eat vegetarian or pay for the free-range stuff, and I’m down to about 3 Subway footlongs a week, without any noticeable decrease in my willpower reserves.

Your mileage may vary; please use this tactic carefully.

4) Your suggestions.

Seriously: most of my point in posting here is to gather more suggestions. If I’ve really thought of the three best solutions in two hours with no training, I’ll eat my shirt. And I will, too—it’ll help me repent.