Imagine you are absolutely certain you will cooperate and that your clone will cooperate. You are still capable of asking “what would my payoff be if I didn’t cooperate?”, and this payoff is the payoff if you defect while the clone cooperates, since you expect the clone to do whatever you do and you expect to cooperate. There is no reason to update your belief about what the clone will do in this thought experiment, since the thought experiment concerns a zero-probability event.
The psychopath case is different because I have uncertainty regarding whether I am a psychopath and the choice I want to make helps me learn about myself. I have no uncertainty concerning my clone.
You are reasoning about an impossible scenario; if the probability of you reaching the event is 0, the probability of your clone reaching it is also 0. To make it a sensible notion, you have to consider epsilon probabilities; since the probability will be the same for both you and your clone, this gets you
$200(1-\epsilon)^2 + 300\epsilon(1-\epsilon) + 100\epsilon^2 = 200 - 400\epsilon + 200\epsilon^2 + 300\epsilon - 300\epsilon^2 + 100\epsilon^2 = 200 - 100\epsilon$, which is maximized when $\epsilon = 0$.
To claim that you and your clone could take different actions is to turn this into a question about trembling-hand equilibria, which violates the basic assumptions of the game.
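The expansion above can be checked numerically. A minimal sketch, assuming the payoffs implied by the expansion: 200 for mutual cooperation, 300 for defecting against a cooperator, 0 for cooperating against a defector, and 100 for mutual defection, with both players defecting independently with probability epsilon:

```python
def expected_payoff(eps):
    """Expected payoff when you and your clone each defect with probability eps.

    Payoffs (assumed from the expansion): CC=200, DC=300, CD=0, DD=100.
    """
    coop = 1 - eps
    return 200 * coop**2 + 300 * eps * coop + 0 * coop * eps + 100 * eps**2

# The quadratic terms cancel, leaving the linear function 200 - 100*eps,
# which is maximized at eps = 0 (certain cooperation).
for eps in (0.0, 0.1, 0.5, 1.0):
    assert abs(expected_payoff(eps) - (200 - 100 * eps)) < 1e-12
```

The check confirms that the epsilon-squared terms cancel exactly, so the expected payoff is strictly decreasing in the defection probability.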
It’s common in game theory to consider off-the-equilibrium-path situations that occur with probability zero without taking a trembling-hand approach.
I agree with your first paragraph.