This second puzzle is a formalization of a Prisoner’s Dilemma strategy proposed by Eliezer: “I cooperate if and only if I expect you to cooperate if and only if I cooperate”. So far we have only known how to make this strategy work by “reasoning from symmetry”, also known as quining. But programs A and B can be very different: say, a human-created AI versus an extraterrestrial crystalloid AI. Will they cooperate?
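For concreteness, here is a minimal Python sketch of the quining approach, under the assumption that each player receives the other’s source code as a string and outputs "C" or "D"; the name `cliquebot` and this protocol are illustrative, not part of the original puzzle.

```python
import inspect

def cliquebot(opponent_source: str) -> str:
    # "Reasoning from symmetry": cooperate only with an exact textual
    # copy of myself. Reading my own source via inspect is a cheap
    # stand-in for a true quine. (Illustrative sketch, not the post's
    # actual formalization.)
    my_source = inspect.getsource(cliquebot)
    return "C" if opponent_source == my_source else "D"

# Two exact copies cooperate:
print(cliquebot(inspect.getsource(cliquebot)))    # -> C

# A behaviorally identical but textually different program gets defection:
print(cliquebot('def other(src): return "C"'))    # -> D
```

The second call shows the limitation: the symmetry test is purely textual, so two programs as different as a human-created AI and a crystalloid AI would defect against each other even if both implement Eliezer’s strategy, which is exactly what makes the puzzle nontrivial.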
And interestingly, posters here only started criticizing this game-theoretic reasoning when I used it. c=/
Why?
Your original formulation did not include the “I expect you to” part.