I read this as suggesting that we would fail to convince the creationist to cooperate. So we would weep for all the people who would die due to their defection.
I read it as saying that if the creationist could have been convinced of evolution, then 3 billion rather than 2 billion could have been saved; after the door shuts, MBlume then follows the policy of “both cooperate if we still disagree” that he and the creationist both signaled they were genuinely capable of.
(But it is not at all clear from what he wrote.)
I have to agree. MBlume, you should have written this post so that someone reading it on its own doesn't get a false impression. It makes sense within the debate, and especially in the context of your previous post, but it is very ambiguous if it's the first thing one reads.
There's perhaps one more source of ambiguity: the distinction between

- the assertion that "cooperate without communication, given only mutual knowledge of complete rationality in decision theory" is part of the completely rational decision theory, and
- the discussion of "agree to mutually cooperate in such a fashion that you each unfakeably signal your sincerity" as a feasible PD strategy for quasi-rational human beings.
If all goes well, I’d like to post on this myself soon.