There are two possible answers to this. One is the raw emotion: it seems right in a wordless fashion. Why do people risk their lives to save an unrelated child, as firefighters do? Saving the human race from extinction seems like the epitome of this ethic.
Then there is the attempt to find a rationale for this feeling: the arguments I have had with myself to give some reason why I might feel this way, or at least why it is not a very bad idea to feel this way.
My view of identity is something like the idea of genetic relatedness. If someone made an atom-level copy of you, that would be pretty much the same person, right? It shares the same beliefs, desires, and viewpoint on the world. But most humans share some beliefs and desires. From my point of view, the fact that you share some interest or way of thinking with me makes you a bit of me, and vice versa: not a large amount, but some. We are identity kin, as well as all sharing much of the same genetic code (as we do with animals). So even if I die, parts of me are in everyone, even if not as obviously as in my friends. We are all mental descendants of Newton and Einstein and share that heritage. Not all things about humanity (or about me) are to be cherished, so I do not preach universal love and peace. But wiping out humanity would remove all of those spread-out bits of me.
Making self-sacrifice easier is the fact that I'm not sure surviving as a posthuman would preserve much of my current identity. In some ways I hope it doesn't, as I am not psychologically ready for grown-up (on the cosmic scale) choices, though I wish to be. In other ways I am afraid that things of value will be lost that don't need to be. But from any view, I don't think it matters much who becomes the grown-ups. So my own personal continuity through the ages does not seem as important as humanity's survival.
I think my friends would also share the same wordless emotion to save humanity, but not the odd wordy view of identity I have.
One is the raw emotion: it seems right in a wordless fashion. Why do people risk their lives to save an unrelated child, as firefighters do? Saving the human race from extinction seems like the epitome of this ethic.
There are two relevant differences between this and wanting to prevent the extinction of humankind. One is, as I told Jack, that emotions like these only work for small numbers of people you can see and interact with personally; you can't really feel the same kind of emotions about humanity as a whole.
The other is that people have all kinds of irrational, suboptimal, bug-ridden heuristics for taking personal risks; for instance, the firefighter might be confident in his ability to survive the fire, even though much of the danger doesn't depend on his actions at all. That's why I prefer to talk about incurring a certain penalty, like killing one person to save another, rather than taking a risk.
From my point of view, the fact that you share some interest or way of thinking with me makes you a bit of me, and vice versa: not a large amount, but some.
I understand this as a useful rational model, but I confess I can’t identify with this way of thinking at all on an emotional level.
What importance do you attach to actually being you (the subjective thread of experience)? Would you sacrifice your life to save the lives of two atomically precise copies of you that were created a minute ago? If not two, how many? In fact, how could you decide on a precise number?
But from any view, I don't think it matters much who becomes the grown-ups. So my own personal continuity through the ages does not seem as important as humanity's survival.
Personal continuity, in the sense of subjective experience, matters very much to me. In fact it probably matters more than the rest of the universe put together.
If Omega offered me great riches and power (or the chance to design a FAI singleton correctly, or anything else I wanted) at the price of losing my subjective experience in some way, which I consider much the same as death on a personal level, then I would say no. How about you?