Making Bad Decisions On Purpose

Allowing myself to make bad decisions on purpose sometimes seems to be a load-bearing part of epistemic rationality for me.

Human minds are so screwed up.

I.

Start from the premise that humans want to do the right thing.

For example, perhaps you are trying to decide whether to do your homework tonight. If you do your homework, you will get a better grade in class. Also, you may learn something. However, if you don’t do your homework tonight you could instead hang out with your roommate and play some fun games. Obviously, you want to do the right thing.

While weighing these two options, you may observe your brain coming up with arguments for and against both sides. University is about networking as well as pure learning, so making a lasting friendship with your roommate is important. To make the most of your time you should do your homework when you’re alert and rested, which isn’t right now. Also, aren’t there some studies that show learning outcomes improved when people were relaxed and took appropriate breaks? That’s if doing homework even helps you learn, which seems uncertain anyway.

Hrm, did I say your brain might come up with arguments for both sides? We seem to have a defective brain here, it seems to have already written its bottom line.

There are a variety of approaches to curbing your brain’s inclination to favour one side over the other here, some easier than others. Sometimes just knowing your brain does this and metaphorically glaring at it is enough to help, though if you’re like me eventually your brain just gets sneakier and more subtle about the biased arguments. This article is about the most effective trick I know, though it does come with one heck of a downside.

Sometimes I cut a deal, and in exchange for the truth I offer to make the wrong decision anyway.

II.

Imagine sitting down at the negotiating table with your brain.

You: “Listen, I’d really like to know if doing homework will help me learn here.”

Your Brain: “Man, I don’t know, do you remember The Case Against Education?”

You: “No, I don’t, because we never actually read that book. It’s just been sitting on the shelf for years.”

Brain: “Yeah, but you remember the title. It looked like a good book! It probably says lots of things about how homework doesn’t help you learn.”

You: “I feel like you’re not taking your role as computational substrate very seriously.”

Brain: “You want me to take this seriously? Okay, fine. I’m not actually optimized to be an ideal discerner of truth. I optimized for something different than that, and the fact that I can notice true things is really kind of a happy coincidence as far as you’re concerned. My problem is that if I tell you yes, you should do your homework, you’ll feel bad about not getting to build social bonds, and frankly I like social bonds a lot more than I like your Biology classwork. The Litany of Tarski is all well and good but what I say is true changes what you do, so I want to say the thing that gets me more of those short term chemical rewards I want. Until you rewire your dopamine emitters to fire upon exposure to truths you do not want to hear, me and the truths you do not want to hear are going to have a rocky relationship.”

You: “. . . Fair point. How about this bargain: you agree to tell me whether I would actually do better in class if I did my homework, and I’ll plan to hang out with my roommate tonight regardless of which answer you give.”

Brain: “Seriously?”

You: “Yep.”

Brain: “. . . This feels like a trap. You know I’m the thing you use to remember traps like this, right? I’m the thing you use to come up with traps like this. In fact, I’m not actually sure what you’re running on right now in order to have this conversation-”

You: “Don’t worry about it. Anyway, I’m serious. Actually try to figure out the truth, and I won’t use it against you tonight.”

Brain: “Fine, deal. It’s a terrible idea to skip your homework, you’re going to be so stressed about it tomorrow, are you kidding?”

You: “Thank you for telling me that, Brain. Have an imaginary cookie.”

Brain: “Thanks. So uh, are you going to make us do homework?”

You: “Not tonight. I’m going to go see if my roommate wants help setting up the Xbox.”

This is, obviously, a bad decision to make and you know that[1]. On the other hand, if your brain was likely to succeed in deceiving you about how it came to that bottom line, then you kind of came out ahead. It would have been easy to incorrectly conclude that the right decision was to hang out with your roommate, and then to do your homework in a panic the next morning. Instead, you at least correctly believe that the right decision was to do your homework. It’s a one step forward, two steps back kind of situation, but at least you didn’t take three steps backwards. You know what the problem is!

III.

Let’s say you’re trying to decide whether or not donating a kidney makes you a good person.[2]

This is a much higher stakes question than whether you do one night’s homework the night before it’s due instead of the morning it’s due. The pros and cons may be subtle, and, more so than in the homework example, some people have deep and intense feelings about it. For some people the label “good person” is one they are willing to fight tooth and nail for, or at least to complain endlessly on Twitter about people who disagree with them.

I submit that your brain is going to be very tempted to write its bottom line before doing anything so reckless as figuring out the truth of the situation. All those subtle temptations from before? They’re going to be even harder to see through. And it’s worth trying: if you can get your brain to stop putting a thumb on the scales, you gain more ways to make better decisions and be less wrong. I put a lot of time and effort into it-

But-

Especially if it’s more important to you to know the truth than to make this decision correctly, or if you don’t think you’d have done the right thing anyway, only felt bad about it-

-then maybe making some bad decisions on purpose can be a way to get access to more truth than you otherwise would be able to reach at your level.

In my case, things aren’t actually as bad as they seem. Often I’ve found I can make a bad decision once in exchange for a better truth, which I can then use in future decisions. (I told my brain I wouldn’t use the answer against it tonight in the dialogue above, and whatever part of my psychology is in charge of writing bottom lines first doesn’t seem to care that much?) In other cases I find that, once I know the answer, I’m not afraid of making the right decision anymore[3]. My model of the world gets better faster, and that seems to percolate through to my actions. It’s slower than if I could think without any writing on the bottom line, not even the faintest tracery, but it seems to have been an important part of becoming more rational and I still do this from time to time. I can even warn other people! “Yep, I’m making a bad decision right now, the right move is that other thing.” I try not to be a hypocrite about this! I’m usually genuinely offering my best knowledge, and now that I’ve written this essay I can point people at it to explain why I’m making the bad decision!

Allowing myself to make bad decisions on purpose sometimes seems to be a load-bearing part of epistemic rationality for me.

Human minds are so screwed up.

  1. ^

    Or if you’ve been impatiently waiting to argue that actually homework is useless and roommates are indeed awesome people who you should pay more attention to because maybe they’re actually kinda cute and you might get to smooch them[4], then hopefully you understand that this basic setup would have worked with pretty much any decision where what was fun and immediately rewarding conflicted with what the truth probably would have suggested you do.

  2. ^

    Just like the homework thing, I’m going to assume a right answer here. Please do me the favour of imagining I did something really cool and a little spooky with the HTML on this web page such that, right as you were about to read me assuming the right answer was something you think is false, it swaps the text on the page so that the example is set up such that it’s an example you agree with.

  3. ^

    Ethical conundrums seem weirdly prone to this. I may write more on this some other time.

  4. ^

    No? That wasn’t what you wanted to argue? You just wanted to make the first point about the homework and not the second about roommates? That also works, but as reasonable as your argument is I think you’re missing out on some options you may not have properly considered.