One reason is the raw emotion: it seems right in a wordless fashion. Why do people risk their lives to save an unrelated child, as firefighters do? Saving the human race from extinction seems like the epitome of this ethic.
There are two relevant differences between this and wanting to prevent the extinction of humankind. One is, as I told Jack, that emotions only work for small numbers of people you can see and interact with personally; you can't really feel the same kind of emotions about humanity.
The other is that people have all kinds of irrational, suboptimal, bug-ridden heuristics for taking personal risks; for instance, the firefighter might be confident in his ability to survive the fire, even though much of the danger doesn't depend on his actions at all. That's why I prefer to talk about incurring a certain penalty, like killing one guy to save another, rather than taking a risk.
From my point of view, the fact that you share some interest or way of thinking with me makes you a bit of me, and vice versa; not a large amount, but some.
I understand this as a useful rational model, but I confess I can’t identify with this way of thinking at all on an emotional level.
What importance do you attach to actually being you (the subjective thread of experience)? Would you sacrifice your life to save the lives of two atomically precise copies of you that were created a minute ago? If not two, how many? In fact, how could you decide on a precise number?
But from any view, I don't think it matters that much who will become the grown-ups. So my own personal continuity through the ages does not seem as important as the survival of humankind.
Personal continuity, in the sense of subjective experience, matters very much to me. In fact it probably matters more than the rest of the universe put together.
If Omega offered me great riches and power, or the chance to design an FAI singleton correctly, or anything else I wanted, at the price of losing my subjective experience in some way (which, on a personal level, I define as much the same as death), then I would say no. How about you?