Game 1: I take the second option. I want 1000 years of exquisite bliss much more than I don’t want to have a box of hornets in my hand.
Game 2: First option. I place no value at all on perfect simulations of myself, and a billion dollars is pretty sick.
I have no preference regarding what choices perfect simulations of me would make, since I don't care about them at all, though I would assume they make the same choices I would, since they have the same values.
How does increasing the amount or length of time change the question?
What in this post merited downvoting without explanation?
(This was at −1 when I found it)
Probably because its thinking is sufficiently far from standard Less Wrong computationalism as to seem stupid to someone.