Trust in Math

Followup to: Expecting Beauty

I was once reading a Robert Heinlein story—sadly I neglected to note down which story, but I do think it was a Heinlein—where one of the characters says something like, “Logic is a fine thing, but I have seen a perfectly logical proof that 2 = 1.” Authors are not to be confused with characters, but the line is voiced by one of Heinlein’s trustworthy father figures. I find myself worried that Heinlein may have meant it.

The classic proof that 2 = 1 runs thus. First, let x = y = 1. Then:

  1. x = y

  2. x² = xy

  3. x² - y² = xy - y²

  4. (x + y)(x - y) = y(x - y)

  5. x + y = y

  6. 2 = 1

Now, you could look at that, and shrug, and say, “Well, logic doesn’t always work.”

Or, if you felt that math had rightfully earned just a bit more credibility than that, over the last thirty thousand years, then you might suspect the flaw lay in your use of math, rather than Math Itself.

You might suspect that the proof was not, in fact, “perfectly logical”.

The novice goes astray and says: “The Art failed me.”
The master goes astray and says: “I failed my Art.”

Is this (gasp!) faith? To believe that math is consistent, when you have seen with your own eyes a proof that it is not? Are you supposed to just ignore the contrary evidence, my good Bayesian?

As I have remarked before, it seems worthwhile to distinguish “faith” that the sun will rise in the east just like the last hundred thousand times observed, from “faith” that tomorrow a green goblin will give you a bag of gold doubloons. When first-order arithmetic has been observed to be internally consistent over the last ten million theorems proved in it, and you see a seeming proof of inconsistency, it is, perhaps, reasonable to double-check the proof.

You’re not going to ignore the contrary evidence. You’re going to double-check it. You’re also going to take into account the last ten million times that first-order arithmetic has proven consistent, when you evaluate your new posterior confidence that the proof of 2 = 1 is not, in fact, perfectly logical. On that basis, you will assign a high probability that, if you check for a flaw, you will find one.

But isn’t this motivated skepticism? The most fearful bane of students of bias? You’re applying a stronger standard of checking to incongruent evidence than to congruent evidence?

Yes. So it is necessary to be careful around this sort of reasoning, because it can induce belief hysteresis—a case where your final beliefs end up determined by the order in which you see the evidence. When you add decision theory, unlike the case of pure probability theory, you have to decide whether to take costly actions to look for additional evidence, and you will do this based on the evidence you have seen so far.

Perhaps you should think to yourself, “Huh, if I didn’t spot this flaw at first sight, then I may have accepted some flawed congruent evidence too. What other mistaken proofs do I have in my head, whose absurdity is not at first apparent?” Maybe you should apply stronger scrutiny to the next piece of congruent evidence you hear, just to balance things out.

Real faith, blind faith, would be if you looked at the proof and shrugged and said, “Seems like a valid proof to me, but I don’t care, I believe in math.” That would be discarding the evidence.

You have a doubt. Move to resolve it. That is the purpose of a doubt. After all, if the proof does hold up, you will have to discard first-order arithmetic. It’s not acceptable to be walking around with your mind containing both the belief that arithmetic is consistent, and what seems like a valid proof that 2 = 1.

Oh, and the flaw in the proof? Simple technique for finding it: Substitute 1 for both x and y, concretely evaluate the arithmetic on both sides of the equation, and find the first line where a true equation is followed by a false equation. Whatever step was performed between those two equations must have been illegal—illegal for some general reason, mind you; not illegal just because it led to a conclusion you don’t like.
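For concreteness, here is a minimal sketch of that checking technique in Python; the step labels are informal, and nothing is assumed beyond the six lines of the proof above.

    # Plug in x = y = 1, evaluate each line of the "proof" as a concrete
    # arithmetic claim, and report the first place where a true equation
    # is followed by a false one.
    x = y = 1

    steps = [
        ("x = y",                     x == y),
        ("x^2 = xy",                  x**2 == x * y),
        ("x^2 - y^2 = xy - y^2",      x**2 - y**2 == x * y - y**2),
        ("(x + y)(x - y) = y(x - y)", (x + y) * (x - y) == y * (x - y)),
        ("x + y = y",                 x + y == y),
        ("2 = 1",                     2 == 1),
    ]

    for (prev_label, prev_ok), (label, ok) in zip(steps, steps[1:]):
        if prev_ok and not ok:
            print("True: ", prev_label)
            print("False:", label)
            print("The step taken between these two lines is the illegal one.")
            break

Running it points you at the guilty step; the general reason that step is illegal is the spoiler mentioned below.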

That’s what Heinlein should have looked for—if, perhaps, he’d had a bit more faith in algebra.


Added: Andrew2 says the character was Jubal from Stranger in a Strange Land.

Charlie says that Heinlein did graduate work in math at UCLA and was a hardcore formalist. I guess either Jubal wasn’t expressing an authorial opinion, or Heinlein meant to convey “deceptively logical-seeming” by the phrase “perfectly logical”.

If you don’t already know the flaw in the algebra, there are spoilers in the comments ahead.