I don’t mean that the probability is always 50/50. But it’s not 100% either.
In Europe, the smartest people believed in God for centuries, and they saw endless confirmations of that belief. And then, bam! It turned out they were all simply wrong.
Or take almost any area of premodern medicine. European doctors believed for centuries that bloodletting cured everything, while Chinese doctors believed that eating lead prolonged life. There are plenty of other examples where all the experts were wrong: geocentrism, the ether theory, the idea that mice spontaneously generate in dirty laundry, the miasma theory of disease…
In all these cases the failure came either from cognitive biases (God, medicine) or from a lack of information and a broken public discussion (geocentrism).
Today we fight biases much better than we did a thousand years ago, but we’re still far from perfect.
And we still sometimes have to operate on very limited information.
I think I need fundamental rational habits that would protect me from being as sure of God or bloodletting as those people were. That’s why I subtract a few percentage points of confidence from any conclusion I reach. The more complex the conclusion, and the more speculative or vulnerable to biases my reasoning, the more I subtract.
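A minimal sketch of what I mean, with an arbitrary 2%-per-step penalty and a hand-wavy count of "speculative steps" (both are made-up illustrations, not calibrated constants):

```python
def discounted_confidence(raw: float, speculative_steps: int,
                          penalty_per_step: float = 0.02) -> float:
    """Knock a few percentage points off a raw confidence estimate.

    `raw` is the confidence my reasoning itself suggests (0..1);
    `speculative_steps` is a rough count of complex or bias-prone
    steps in the argument; the per-step penalty is an arbitrary
    illustration, not a calibrated value.
    """
    return max(0.0, raw - speculative_steps * penalty_per_step)

# A 95%-confident conclusion resting on three speculative steps
# ends up at roughly 89%.
print(discounted_confidence(0.95, speculative_steps=3))  # ~0.89
```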
If you claim that my way of fighting this overconfidence shouldn’t be used, I’d want you to suggest something else instead. You can’t just leave the problem alone; otherwise one might end up assigning 99% confidence to some nonsense.