Sure, you can make a non-me that I prefer to me. I’m somewhat selfish, but I think I put weight on future universe states in themselves.
I can make a non-you that you would prefer to you?
Sure, though I may try to stop you from making one. Depends on the non-me, I guess.
Konkvistador wakes up in a dimly lit, perfectly square white room, sitting on a chair, staring at Omega.
Omega: “Either you or your daughter can die. I have this neat existence machine here. When I turn it on in a few minutes, I can set the lever to you or Anna; the one who is selected is reimplemented in atoms back on Earth, while the other isn’t, and that person’s pattern is deleted.”
Me: “I can’t let Anna die. Take me!”
Omega: “Ok, but before we do that: you can keep on existing, or I can make a daughter exist in your place.”
Me: “What, Anna?”
Omega: “No, no, a new daughter.”
Me: “Eh no thanks.”
Omega: “Are you sure?”
Me: “Pretty much.”
Omega: “Ok, what if I gave you all the memories of several years spent with her?”
Me: “That would make her the same to me as Anna.”
Omega: “It would. Indeed, I may have done that already. Anna may or may not exist currently. So, about the memories of an extra daughter …”
Me: “No thanks.”
Omega: “Ok, ok, would you like your memories of Anna taken away too?”
Me: “No.”
Omega: “Anna is totally made up, I swear!”
Me: “It seemed probable; the answer is no regardless. And yes to saving Anna’s life.”
Konkvistador dies, and somewhere on Earth, in a warm bed, Anna awakes and takes her first breath, except no one knows it is her first.
Just to make sure I understood: you can value the existence of a nonexistent person of whom you have memories that you know are delusional more than you value your own continued existence, as long as those memories contain certain properties. Yes?
Yes.
So, same question: can you say more about what those properties are? (I gather from your example that being your daughter can be one of them.)
I discovered the daughter example purely empirically when doing thought experiments. It seems plausible there are other examples.
Also… is it important that they be memories? That is, if instead of delusional memories of your time with Anna, you had been given daydreams about Anna’s imagined future life, or been given a book about a fictional daughter named Anna, might you make the same choice?
Both of these would have significantly increased the probability that I would choose Anna over myself, but I think the more likely course of action is that I would choose myself.
If I have memories of Anna and my life with her, I basically find myself in the “wrong universe,” so to speak: the universe where Anna and my life over the past few years didn’t happen. I have the option of saving either Anna or myself by putting one of us back in the right universe (turning this one into the one in my memory).
In any case, I’m pretty sure that Omega could write books good enough to make you value Anna or Alice or Bob above your own life. He could probably even make a good enough picture of an “Anna” or “Alice” or “Bob” for you to want her or him to live even at the expense of your own life.
Suppose one day you are playing around with some math and you discover a description of … I hope you can see where I’m going with this. Not knowing the relevant data set describing the theoretical object, be it Anna, Bob, or Cthulhu, you may not want to learn of them if you think doing so will make you prefer their existence to your own. But once you know them, by definition you value their existence above your own.
This brings up some interesting associations in my mind, not just with Basilisks but also with CEV.