The “second horn” seems to be phrased incorrectly. It says:
“you can coherently anticipate winning the lottery after five seconds, anticipate the experience of having lost the lottery after fifteen seconds, and anticipate that once you experience winning the lottery you will experience having still won it ten seconds later.”
That’s not really right: the fate of most of the agents that experience winning the lottery is to be snuffed out of existence. They don’t actually win the lottery, and they don’t experience having still won it ten seconds later either. The chances of the lottery staying won after it has been experienced as won are slender.
Either that “horn” needs rephrasing—or another “horn” needs to be created with the correct answer on it.
If I understand the proposed merging procedure correctly, it treats the trillion observers who experience winning the lottery symmetrically. None of them is “snuffed” any more than any other. For each observer, there is a continuous space-time-causality “worm” connecting it to the future self who spends the money.
This space-time-causality worm is supposed to be as analogous as possible to the one that connects any ordinary moment in your life to your future self. The difference is that this one merges (symmetrically) with almost a trillion others, all identical.
I see, I think. I can’t help wondering what the merge procedure does with any flipped bits in the diff, though. Anyway, horn 2 now seems OK—I think it describes the situation.
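One hedged way to picture the flipped-bits worry: if the copies have drifted by a few bits, the merge procedure needs some rule for resolving the diff. A minimal sketch, assuming (purely for illustration; the thought experiment specifies no such mechanism) that divergent bits are resolved by majority vote across copies:

```python
from collections import Counter

def merge_states(states: list[bytes]) -> bytes:
    """Merge near-identical copies by taking, at each byte position,
    the most common value across all copies. Stray flipped bits in a
    few copies are simply outvoted by the (near-)trillion identical ones."""
    if not states:
        raise ValueError("nothing to merge")
    length = len(states[0])
    if any(len(s) != length for s in states):
        raise ValueError("copies must be the same length to diff and merge")
    return bytes(
        Counter(s[i] for s in states).most_common(1)[0][0]
        for i in range(length)
    )

# Three copies, one with a flipped bit in the middle byte;
# the outlier is outvoted:
copies = [b"\x01\x02\x03", b"\x01\x02\x03", b"\x01\x06\x03"]
print(merge_states(copies))
```

Under this (assumed) rule, any bit flipped in only a small minority of copies is silently discarded, which is exactly the kind of detail that makes it unclear whether the merged result is “the same pattern” as each pre-merge observer.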
Rereading the comments on this thread, the problem is more subtle than I had thought—and I had better retract the above comment. I am inclined towards the idea that copying doesn’t really alter the pattern—but that kind of anthropic reasoning seems challenging to properly formalise under the given circumstances.
Yup! If you can’t do the merge without killing people, then the trilemma is dissolved.