In case it’s this ambiguity: MBlume’s strategy isn’t “cooperate in any scenario.”
Ah. It did look to me as though he was suggesting that. For, after describing how we would try to convince the creationist to cooperate (by trying to convince them of their epistemic error), he writes:
But of course, you would fail. And the door would shut, and you would grit your teeth, and curse 2000 years of screamingly bad epistemic hygiene, and weep bitterly for the people who might die in a few hours because of your counterpart’s ignorance.
I read this as suggesting that we would fail to convince the creationist to cooperate. So we would weep for all the people who would die due to their defection. In that case, to suggest that we ought to cooperate nonetheless would seem futile in the extreme; hence my comment about merely adding to the reasons to weep.
But I take it your proposal is that MBlume meant something else: not that we would fail to convince the creationist to cooperate, but rather that we would fail to convince them to let us defect. That would make more sense. (But it is not at all clear from what he wrote.)
I read this as suggesting that we would fail to convince the creationist to cooperate. So we would weep for all the people who would die due to their defection.
I read it as saying that if the creationist could have been convinced of evolution, then 3 billion rather than 2 billion could have been saved; after the door shuts, MBlume then follows the policy of “both cooperate if we still disagree” that he and the creationist had both signaled they were genuinely capable of following.
(But it is not at all clear from what he wrote.)
I have to agree. MBlume, you should have written this post so that someone reading it on its own doesn’t get a false impression. It makes sense within the debate, and especially in the context of your previous post, but is very ambiguous if it’s the first thing one reads.
There’s perhaps one more source of ambiguity: the distinction between
(1) the assertion that “cooperate without communication, given only mutual knowledge of complete rationality in decision theory” is part of the completely rational decision theory, and
(2) the discussion of “agree to mutually cooperate in such a fashion that you each unfakeably signal your sincerity” as a feasible PD strategy for quasi-rational human beings. (A toy sketch contrasting the two follows below.)
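To make the distinction concrete, here is a minimal toy sketch in Python. It is my own illustration, not anything from MBlume’s post: the payoff numbers, function names, and signaling flags are all invented, chosen only to preserve the standard prisoner’s-dilemma ordering (temptation > reward > punishment > sucker’s payoff).

```python
# A toy contrast between the two senses of "cooperate" distinguished above.
# All payoffs are hypothetical stand-ins satisfying the PD ordering T > R > P > S.

T, R, P, S = 3, 2, 1, 0  # temptation, reward, punishment, sucker's payoff

PAYOFF = {
    ("C", "C"): (R, R),  # mutual cooperation
    ("C", "D"): (S, T),  # I cooperate, they defect
    ("D", "C"): (T, S),  # I defect, they cooperate
    ("D", "D"): (P, P),  # mutual defection
}

def ideal_agent_move(rationality_is_common_knowledge: bool) -> str:
    """Sense (1): an ideally rational agent cooperates given nothing but
    mutual knowledge that both players run the same rational decision
    theory; no communication takes place at all."""
    return "C" if rationality_is_common_knowledge else "D"

def quasi_rational_human_move(i_signaled: bool, they_signaled: bool) -> str:
    """Sense (2): a quasi-rational human cooperates only after both sides
    have unfakeably signaled sincerity (before the door shuts), even if
    the underlying epistemic disagreement is never resolved."""
    return "C" if (i_signaled and they_signaled) else "D"

# Sense (1): cooperation falls out of common knowledge of rationality alone.
a = b = ideal_agent_move(True)
print("ideal agents:   ", (a, b), "->", PAYOFF[(a, b)])  # ('C', 'C') -> (2, 2)

# Sense (2): sincerity was signaled, the disagreement was never resolved,
# and both still cooperate.
a = quasi_rational_human_move(True, True)
b = quasi_rational_human_move(True, True)
print("signaled humans:", (a, b), "->", PAYOFF[(a, b)])  # ('C', 'C') -> (2, 2)
```

Both senses end in mutual cooperation here; the difference is what licenses the move: common knowledge of rationality in (1), versus an exchanged, unfakeable signal in (2).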
If all goes well, I’d like to post on this myself soon.