The value of Now.

I am an easily bored Omega-level being, and I want to play a game with you.

I am going to offer you two choices.

Choice 1: You spend the next thousand years in horrific torture, after which I restore your local universe to precisely the state it is in now (wiping your memory in the process), and hand you a box with a billion dollars in it.

Choice 2: You spend the next thousand years in exquisite bliss, after which I restore your local universe to precisely the state it is in now (wiping your memory in the process), and hand you a box with an angry hornet’s nest in it.

Which do you choose?

Now, you blink. I smile and inform you that you made your choice, and hand you your box. Which choice do you hope you made?

You object? Fine. Let’s play another game.

I am going to offer you two choices.

Choice 1: I create a perfect simulation of you, and run it through a thousand simulated years of horrific torture (which will take my hypercomputer all of a billionth of a second to run), after which I delete the simulation and hand you a box with a billion dollars in it.

Choice 2: I create a perfect simulation of you, and run it through a thousand simulated years of exquisite bliss (which will take my hypercomputer all of a billionth of a second to run), after which I delete the simulation and hand you a box with an angry hornet’s nest in it.

Which do you choose?

Now, I smile and inform you that I already made a perfect simulation of you and asked it that question. Which choice do you hope it made?

Let’s expand on that. What if instead of creating one perfect simulation of you, I create 2^^^^3 perfect simulations of you? Which do you choose now?

What if instead of a thousand simulated years, I let each simulation run for 2^^^^3 simulated years? Which do you choose now?

I have the box right here. Which do you hope you chose?