What happens to all the other people in the worlds the Waker leaves? I think I see five possibilities, all of which seem dreadful in one way or another.
Their life goes on, but the Waker has mysteriously vanished from it. In this case, a world with Wakers in it sounds pretty bad. No one could rely on anything involving other people. (Unless Wakers chose to “wake up” extremely infrequently; say, only in circumstances like those in which someone in our world might choose suicide. But it seems clear that this isn’t what you have in mind.)
They are discarded, no longer being required. So we have, potentially, billions of people who exist only for the amusement of this Waker, and get turned off when the Waker loses interest. That seems morally dubious to me. I guess someone who has a problem with it wouldn’t choose to be a Waker, but people change over time. How would it be to have a growing conviction that you’re responsible for billions of deaths?
They were never real in the first place, because they were only simulations. If you take the idea of being a Waker seriously, I think you have to reject this one out of hand. (So do I.)
They were never real in the first place, because the Waker’s computational substrate worked out exactly what other people would do without actually instantiating any. It seems unlikely that this is possible: at least some of those other people would need to pass a very stringent version of the Turing test, and computing someone’s behaviour to that fidelity is hard to distinguish from actually instantiating them.
They were never real in the first place, because the Waker found interactions with them plausible only because the computational substrate diddled their brain to make it seem that way. This seems basically indistinguishable from wireheading.
Any answer that begins “They were never real in the first place” (including those last three, but also any other possibilities I’ve overlooked) is open to the more general objection that a Waker’s existence is (and is known to be) unreal and inauthentic and doesn’t involve any interactions with actual other people.
[EDITED to add: Oh, I thought of another two possibilities but they’re also awful:]
They are immediately replaced by someone else exactly the same as them, so no one else even notices. Someone else exactly the same would do the same as they did, and someone else almost exactly the same would likely do the same very soon. Someone else exactly the same except suddenly deprived of the Waker’s power? For that to work, it would need to be the case that Wakers might lose their powers at any moment, which seems like it would take most of the fun out of it. -- And if you do replace them with “someone else exactly the same”, isn’t that indistinguishable from saying that every time you attempt to Wake there’s a 50% chance that it won’t work? After all, from the inside you can’t tell whether you’re the one who left or the identical one who stayed (see the sketch after these possibilities).
They are immediately replaced by someone else near enough for practical purposes. It seems like that would have to be very near, especially if there were other people to whom the Waker was very close. And a world in which people are constantly being created from scratch, and dropped into situations that someone else very like them had just decided they’d rather leave the world than deal with, seems pretty unpleasant. As with the second option above, I think some Wakers would find themselves with big moral problems.
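(To make the “50% chance” point concrete, here is a minimal sketch under the simplest reading of the replacement scenario; the names are made up for illustration. If every Wake leaves behind an exact copy, then from the inside you are equally likely to find yourself as either successor, which is indistinguishable from a coin flip on whether the Wake worked.)

```python
import random

def subjective_wake_success_rate(trials: int = 100_000) -> float:
    """Model a Wake that spawns two identical successors: one who leaves
    for the new world and one who stays behind as the replacement.
    Since the successors are indistinguishable from the inside, treat
    "which one you turn out to be" as a uniform random draw."""
    woke = 0
    for _ in range(trials):
        you = random.choice(["left_for_new_world", "stayed_behind"])
        if you == "left_for_new_world":
            woke += 1
    return woke / trials

# Prints roughly 0.5: subjectively, half of all Wake attempts "fail".
print(f"Subjective chance the Wake worked: {subjective_wake_success_rate():.3f}")
```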
There are, of course, many possible variants. The one I focus on is largely solipsistic, where all the people are generated by an AI. Keep in mind that the AI needs to fully emulate only a handful of personas, and they’re largely recycled in the transition to a new world. (Option 2, then.)
I can understand your moral reservations; however, we should keep the distinction between a real instantiation and an AI’s persona. Imagine the reality-generating AI as a skilful actor and writer. It generates a great number of personas with different stories, personalities, and apparent internal subjectivity. When you read a good book, you usually cannot tell whether the events and people in it are real or made up; the same goes for a skilful improv actor: you cannot tell whether you are seeing a real person or just a persona. In that sense they all pass the Turing test. However, you wouldn’t say a writer kills a real person when he stops writing about some fictional character, or that an actor kills a real person when she stops playing a role.
Of course, you may argue that this makes the Waker’s life meaningless if she is surrounded by pretenders. But that seems silly: her relationships with other people are the same as yours.
My reservations aren’t only moral; they are also psychological: that is, I think it likely (whether or not I am “right” to have the moral reservations I do, whether or not that’s even a meaningful question) that if there were a lot of Wakers, some of them would come to think that they were responsible for billions of deaths, or at least to worry that they might be. And I think that would be a horrific outcome.
When I read a good book, I am not interacting with its characters as I interact with other people in the world. I know how to program a computer to describe a person who doesn’t actually exist in a way indistinguishable from a description of a real ordinary human being. (I.e., take a naturalistic description such as a novelist might write, and just type it into the computer and tell it to write it out again on demand.) The smartest AI researchers on earth are a long way from knowing how to program a computer to behave (in actual interactions) just like an ordinary human being. This is an important difference.
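(A minimal sketch of that asymmetry, with a made-up person and hypothetical function names: replaying a canned description is trivial, while the interactive part has no comparable implementation.)

```python
# Trivial: "describe a person indistinguishably from a novelist's
# description" by storing a human-written description and replaying it.
DESCRIPTION = (
    "Anna is forty-one, laughs too loudly at her own jokes, and keeps a "
    "dog-eared atlas on her desk for trips she never takes."
)

def describe_person() -> str:
    # Passes any description test by construction: it *is* a
    # novelist-style description. No understanding is involved.
    return DESCRIPTION

# Hard: behaving like Anna across open-ended interactions. There is no
# one-liner here; a convincing respond() is the unsolved part.
def respond(conversation_so_far: list[str]) -> str:
    raise NotImplementedError("this is the part nobody knows how to write")
```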
It is at least arguable that emulating (let’s say) at least dozens of people with enough fidelity to stand up to the kind of inspection our hypothetical “Waker” would be able to give them requires a degree of simulation that would necessarily make those emulated someones persons. Again, it doesn’t really matter that much whether I’m right, or even whether it’s actually a meaningful question; if a Waker comes to think the emulations are persons, then they’re going to see themselves as a mass murderer.
[EDITED to add: And if our hypothetical Waker doesn’t come to think that, then they’re likely to feel that their entire life involves no real human interaction, which is also very very bad.]