Living bias, not thinking bias

1. Biases, those traits which affect everyone but me

I recently had the opportunity to run an exercise on bias and rationality with a group of (fellow) university students. I wasn’t sure it was going to go down well. There’s one response that always haunts me when it comes to introducing bias: “That’s an interesting description of other people, but it doesn’t describe me.”

I can’t remember the details (and haven’t been able to track them down), but I once read about an experiment on some bias; let’s say it was hindsight bias. The research team carried out a standard experiment which showed that the participants were biased as expected. Afterwards, they told these participants about hindsight bias. Most of the participants thought this was interesting and probably explained the actions of other people in the experiment, but they didn’t think it explained their own actions.

So going into the presentation, this is what I was worried about: People thinking these biases were just abstractions which didn’t affect them.

Then, at the end, everyone’s comments made it clear that this wasn’t the case. They really had realised that these were biases which affected them. The question, then, is: what led them to this conclusion?

2. Living history, living bias

All of the other planets (and the Earth) orbit the Sun. Once upon a time, we didn’t believe this: We thought that these planets (and the Sun) orbited the Earth.

Imagine that you’re alive all that time ago, just as the balance of evidence has swung to favour the theory that the planets orbit the Sun. You, however, steadfastly insist that they orbit the Earth. Why? Because your father told you so when you were a child, and you always believe what your father tells you. Then a friend explains all of the evidence in favour of the theory that the planets orbit the Sun. Eventually, you realise that you were mistaken all along and, at the same time, you realise something else: that it was a mistake not to question a belief just because your father endorsed it.

If you think about history, you learn which beliefs were wrong. If you live history, you learn this and you also learn what it feels like to mistakenly endorse an incorrect belief. Maybe, the next time the situation arises, you can avoid making the same error.

In teaching people about biases, I think it’s best to help students to live biases and not just think about them. That way, they’ll know what it feels like to be biased, and they’ll know that they are biased.

3. Rationality puzzles

One of the best ways to do this, and the technique I used in my presentation, seems to be the use of rationality puzzles. Basically, these are puzzles where the majority of respondents tend to reason in a biased or fallacious way. Run a few of these puzzles and most students will reason incorrectly in at least one of them. This gives them a chance to experience being biased. If lessons focused on an abstract presentation of biases instead, the students would think about the biases but not live them in the same way.

So one example of a rationality puzzle is the 2, 4, 6 task. When I ran this exercise for my presentation, I broke the group up into pairs and made one member of each pair the questioner and the other the respondent.

The respondent was given a slip of paper with a number rule written on it. This was a rule that a sequence of three numbers could either meet or fail to meet. I won’t mention what the rule was yet, to give those who haven’t come across the puzzle a chance to think about how they would proceed.

The questioner’s job was to guess this rule. They were given one clue: the sequence 2, 4, 6 met the rule. The questioner was then allowed to ask whether other three-number sequences met the rule, and the respondent would let them know whether each did. The questioner could ask about as many sequences as they wanted to and, when they were confident, they were to write their guess down (I limited the exercise to five minutes for practical purposes, and everyone had written down an answer by then).

The answer was: Any three numbers in ascending order. None of the students in the group got the right answer.

I then used the exercise to explain a bias called positive bias. First, I noted that only 21% of respondents reached the right answer in this task. Then I pointed out that what’s interesting isn’t this figure but rather why so few people reach the right answer. Specifically, people think to test positive, rather than negative, cases. In other words, they’re more likely to test cases that their theory predicts will occur (in this case, those that get a yes answer) than cases that their theory predicts won’t. So if someone’s initial theory was that the rule was “three numbers, each two higher than the previous one”, then they might test “10, 12, 14” as this is a positive case for their theory. On the other hand, they probably wouldn’t test “10, 14, 12” or “10, 13, 14”, as these are negative cases for their prediction of the rule.

This demonstrates positive bias: the bias toward thinking to test positive, rather than negative, cases for one’s theory (see here for previous discussion of the 2, 4, 6 task on Less Wrong).
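
To make the asymmetry concrete, here’s a minimal sketch in Python. It’s my own illustration, not part of the original exercise, and the function names (true_rule, initial_theory) are hypothetical:

    # The 2, 4, 6 task: the hidden rule versus a typical initial theory.

    def true_rule(seq):
        # The experimenter's hidden rule: any three numbers in ascending order.
        return seq[0] < seq[1] < seq[2]

    def initial_theory(seq):
        # A typical first guess: each number is two higher than the previous one.
        return seq[1] == seq[0] + 2 and seq[2] == seq[1] + 2

    # Positive cases: the theory predicts a "yes", and the hidden rule agrees,
    # so these tests can never tell the two rules apart.
    for seq in [(2, 4, 6), (10, 12, 14)]:
        print(seq, "theory predicts:", initial_theory(seq), "actual:", true_rule(seq))

    # Negative cases: the theory predicts a "no". Note that (10, 13, 14) gets a
    # "yes" anyway, which is exactly the evidence needed to falsify the theory.
    for seq in [(10, 14, 12), (10, 13, 14)]:
        print(seq, "theory predicts:", initial_theory(seq), "actual:", true_rule(seq))

Every positive test gets the same answer under both rules, so a questioner who only ever tests positive cases will walk away convinced of their initial theory; it’s the negative cases that can expose it.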

Puzzles like this allow the student to live the bias and not just consider it on an abstract level.

4. Conclusion

In teaching people about biases, we should be trying to make them live biases rather than just think about them. Rationality puzzles offer one of the best ways to achieve this.

Of course, for any individual puzzle, some people will get the right answer. With the 2, 4, 6 puzzle in particular, a number of us have found that people perform better on this task in casual, rather than formal, settings. The best way to deal with this is to present a series of puzzles that reveal a variety of different biases. Most people will reach the wrong answer in at least one puzzle.

5. More rationality puzzles

Bill the accountant and the conjunction fallacy

Wason Selection Task

World War II and Selection Effects (not quite a puzzle yet, but it feels like it could be made into one)