I agree that any discussion of god-related topics might take several times longer, since you’d have to go into cognitive biases. You’d probably need to explain Bayesianism, or even argue for it, before you could move on. In the worst case, you’d have to point them to the Sequences Highlights. Of course, they won’t read that: it’s hundreds of pages long, and Eliezer constantly speaks out against religion, so believers wouldn’t enjoy it anyway. Right, that would take an absurd amount of time.
Still, I personally estimate the probability that creationists are wrong at only about 80%, simply because I haven’t really looked into their line of argumentation, and I’ve never even seriously debated a believer. Intuitively, it feels absurd to deny something without really understanding what exactly it is you’re denying.
> I agree that any discussion of god-related topics might take several times longer, since you’d have to go into cognitive biases.
Okay then, let’s use homeopathy as an example. I can fairly and honestly say that my position—which is that homeopathy is crap—is basically 100% correct. Or Holocaust deniers. I can fairly and honestly say that my position—which is that the Holocaust was real—is basically 100% correct.
Saying “everyone’s human, every side has smart people on it, so the sides are 50% correct” doesn’t work. Holocaust deniers are certainly human, and they’re not stupid. But they and I are not equally correct.
(I’d also ask: if you’re going to exclude god-related topics because of cognitive biases, how is that not special pleading? In other contexts, you reject the idea of saying “my political opponents have cognitive biases”. After all, we’re all human, all sides have smart people, etc.)
I don’t mean that the probability is always 50/50. But it’s not 100% either.
In Europe, the smartest people believed in God for centuries, and they saw endless confirmations of that belief. And then, bam: it turned out they were all simply wrong.
Or take any case of ancient medicine. European doctors believed for centuries that bloodletting cured everything, while Chinese doctors believed that eating lead prolonged life. There are also other examples where all the experts were wrong: geocentrism, the ether theory, the idea that mice spontaneously generate in dirty laundry, the miasma theory of disease…
In all these cases it was either about cognitive biases (God, medicine) or about lack of information or broken public discussion (geocentrism).
Today we fight biases much better than a thousand years ago, but we’re still far from perfect.
And we still sometimes operate under very limited information.
I think one should have fundamental rational habits that protect against being so sure about God or bloodletting. That’s why I subtract a few percentage points of confidence from any conclusion I reach. The more complex the conclusion, and the more speculative or bias-prone my reasoning, the more I subtract.
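To make that habit concrete, here’s a minimal sketch in Python of the kind of discount I mean. The function name, the penalty size, and the complexity/speculativeness scores are all hypothetical illustrations, not a procedure I literally run:

```python
def discounted_confidence(raw_confidence: float,
                          complexity: int,
                          speculativeness: int,
                          penalty_per_point: float = 0.02) -> float:
    """Subtract a couple of percentage points of confidence for each
    point of complexity and speculativeness, flooring at 50% (below
    that, I'd just be betting on the other side)."""
    penalty = penalty_per_point * (complexity + speculativeness)
    return max(0.5, raw_confidence - penalty)

# A simple, well-checked conclusion loses almost nothing:
print(discounted_confidence(0.99, complexity=1, speculativeness=0))  # ~0.97
# A long, speculative, bias-prone chain of reasoning loses a lot:
print(discounted_confidence(0.99, complexity=5, speculativeness=5))  # ~0.79
```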
If you claim that my way of fighting this overconfidence shouldn’t be used, I’d want you to suggest something else instead. Because you can’t just leave it as it is—otherwise one might assign 99% confidence to some nonsense.
> it feels absurd to deny something without really understanding what exactly it is you’re denying.
The hypothesis space is so large that it’s absurd to give anything more than a passing glance if it doesn’t make sense on its face and nobody can convince you, within a fairly short time, that it’s at least worth exploring.
I suppose you can fall back on radical agnosticism, but it’s hard to recommend any policy or decision based on not knowing anything.