Was this motivated by Nick Bostrom’s paper? If not, you might enjoy it—it also discusses the idea of splitting a consciousness into copies by duplicating identical computational processes.
Yup. In short, I think there is a very simple point at which a slowly-dividing computer should start acting as two agents, and that is when, in the Sleeping Beauty problem, I make two bets with it rather than one.
Hmm. If we use that sort of functional definition, does that mean that if I offer the thicker computer more money, that changes the amount of experience? Nope—it still has the correct betting behavior if it uses a probability of 0.5.
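To make the "correct betting behavior with a probability of 0.5" claim concrete, here is a minimal sketch (the function name and setup are mine, not from the thread) of the per-awakening bet in the standard Sleeping Beauty problem: heads means one awakening, tails means two, and at each awakening the agent pays `cost` for a ticket worth 1 if the coin landed tails.

```python
# Hypothetical sketch: expected profit of a per-awakening bet in the
# Sleeping Beauty problem. Heads -> 1 awakening, tails -> 2 awakenings;
# each awakening the agent pays `cost` and receives 1 if tails.
def expected_profit(cost):
    heads = 0.5 * 1 * (0 - cost)   # heads: one awakening, the ticket pays nothing
    tails = 0.5 * 2 * (1 - cost)   # tails: two awakenings, the ticket pays twice
    return heads + tails
```

An agent that assigns the coin a credence of 0.5 but recognizes that the tails branch settles the bet twice breaks even at `cost = 2/3`, the same betting behavior a "thirder" derives from a per-awakening credence of 1/3. Scaling the stakes (offering "more money") multiplies every term equally, so the break-even point, and hence the behavior, is unchanged.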