This is actually kind of interesting. The only thing that makes me consider picking choice one is the prospect of donating the billion dollars to charity and saving countless lives, but I know that’s not really the point of the thought experiment. So, yeah, I’d choose choice two.
But the interesting thing is that, intuitively at least, choosing choice two in the first game seems much more obvious to me. It doesn’t seem rational to care whether a simulation of you is tortured any more than you would care about a simulation of someone else. Either way, you wouldn’t actually ever have to experience it. The empathy factor might be stronger if it’s a copy of you—“oh shit, that guy is being tortured!” vs. “oh shit, that guy who looks and acts exactly like me in every way is being tortured!”—but this is hardly rational. Of course, the simulated me has my memories, so he perceives an unbroken stream of consciousness flowing from making the decision into the thousand years of torture, but who cares. That’s still some other dude experiencing it, not me.
So, yes, it seems strange to treat the memory-loss case any differently. At least I cannot think of a justification for this feeling. This leads me to believe that the choice is a purely altruistic decision, i.e. it’s equivalent to Omega saying “I’ll give you a billion dollars if you let me torture this dude for 1000 years.” In that case, I would have to evaluate whether a billion-dollar dent in world hunger is worth 1000 years of torture for some guy (probably not) and then make my decision.