So even if we assume that AGI would arrive soon unless stopped (e.g. within 5 years) and that unaligned AGI would mean immediate death (which is very far from a given), it seems like your life expectancy would be vastly longer if AGI were developed, even if the chance of alignment were quite small.
This naive expected value calculation completely leaves out what it actually means for humanity to come to an end: if you ever reach zero, you cannot keep playing the game. As I said, I would not take this chance even if the odds were 99 to 1 in favor of it going well. It would be deeply unethical to create AGI under that level of uncertainty, especially since the uncertainty may be reduced given time, and our current situation is almost certainly not that favorable.
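To spell out the calculation I am rejecting, here is a rough sketch in my own notation (the symbols and the specific framing are mine, purely for illustration, not something you stated): let p be the chance of alignment, L_long the enormous lifespan you expect if things go well, and L_normal an ordinary human lifespan. The naive comparison is

\[
\mathbb{E}[\text{years} \mid \text{build AGI}] \;=\; p \cdot L_{\text{long}} + (1 - p) \cdot 0
\qquad \text{vs.} \qquad
\mathbb{E}[\text{years} \mid \text{wait}] \;\approx\; L_{\text{normal}},
\]

and for a sufficiently enormous L_long, even a very small p makes the left-hand side win. Written out this way, it is easier to see where the disagreement actually lives: the entire catastrophe is compressed into that single 0, as if extinction were merely a bad draw rather than the end of all further rounds, for everyone.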
I am not so egoistic as to value my own life (and even the lives of my loved ones) highly enough to make that choice on everyone else's behalf, and on behalf of the whole future of known sentient life. But I also don't personally have any particular wish to live a very long life. I appreciate my life for what it is, and I don't see any great need to improve it to a magical degree or to live vastly longer. There are individuals whose lives are so terrible that it is rational for them to take large risks upon themselves to improve their circumstances, and there are others who simply have a very high appetite for risk. Neither situation applies to most people.
We have been monkeys in shoes for a very long time. We have lived and suffered and rejoiced and died for eons. It would not be a crime against being for things to keep happening roughly the way they always have, with all of the beauty and horror we have always known. What would be a crime against being is to risk a roughly immediate, permanent end to everything of value for utopian ideals that are shared by almost none of the victims. We have warned ourselves about this over and over in our stories about supervillains and ideological despots alike.
Under our shared reality, there is probably no justification for your view that I would ever accept. In that sense, it is not important to me what your justification is. On the other hand, I do have a model of people who hold your view, which may not resemble you in particular:
I view the willingness to gamble away all of value itself as an expression of ingratitude for the value that we do have, and I view the willingness to do this on everyone else’s behalf as a complete disregard for the inviolable consent of others.