Nick Bostrom’s TED talk and setting priorities

I just watched Nick Bostrom’s TED talk titled “Humanity’s biggest problems aren’t what you think they are.” I was expecting Bostrom’s take on the three biggest existential (or at least catastrophic) risks, but instead “existential risk” was just one item on the list. The other two were “death” and “life isn’t usually as wonderful as it could be.”

Putting these other two in the same category as “existential risk” looks like a mistake. That’s especially obvious in the case of (the present, normal rate of) death versus existential risk. Bostrom’s talk gives an annual death rate of 56 million, whereas if you take future generations into account, a 1% reduction in existential risk could save 10^32 lives.
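
To make that gap concrete, here’s a back-of-the-envelope comparison using the two figures above (and the simplifying assumption that the death rate stays roughly constant):

$$
\underbrace{5.6 \times 10^{7}\ \tfrac{\text{deaths}}{\text{year}} \times 10^{2}\ \text{years}}_{\text{a full century at the current death rate}} \approx 5.6 \times 10^{9}\ \text{deaths},
$$

so the 10^32 figure exceeds a full century of the status quo by a factor of roughly $10^{32} / (5.6 \times 10^{9}) \approx 2 \times 10^{22}$.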

More importantly, if we fail to solve “death” and “life isn’t usually as wonderful as it could be” in the next century, there will be other centuries in which to solve them. On the other hand, if we screw up on existential risk in the next century, that’s it: humanity’s run will be over. There are no second chances when it comes to averting existential risk.

One possible counterargument is that the sooner we solve “death” and “life isn’t usually as wonderful as it could be,” the sooner we can start spreading our utopia throughout the galaxy, and even to other galaxies. With exponential growth, a century’s head start on that could lead to a manyfold increase in the total number of utils in the history of the universe.

However, given the difficulty of building probes that travel at even a significant fraction of the speed of light, and the fact that colonizing new star systems may be slow even with advanced nanotech, a century may not matter much when it comes to colonizing the galaxy. Furthermore, colonizing the galaxy (or universe) may not be the sort of thing that follows an exponential curve at all; it may follow a cubic curve, since the volume reached by probes spreading out in a sphere at a bounded speed grows as the cube of time. Under cubic growth, a century’s head start barely changes the final total, as the sketch below illustrates.
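
Here’s a minimal sketch of that difference in Python. The billion-year horizon and the 1%-per-year exponential rate are illustrative assumptions, not figures from Bostrom’s talk; the point is only how differently the two curves reward a century’s head start.

```python
# How much does a 100-year head start multiply the total resources
# reached, under cubic versus exponential expansion? The horizon and
# growth rate below are illustrative assumptions.

HORIZON = 1_000_000_000  # years of expansion (illustrative)
HEAD_START = 100         # the century in question
RATE = 0.01              # assumed exponential growth rate per year

# Cubic: probes spreading out in a sphere at a bounded speed reach a
# volume proportional to t**3, so the head-start multiplier is
# ((HORIZON + HEAD_START) / HORIZON) ** 3.
cubic_gain = ((HORIZON + HEAD_START) / HORIZON) ** 3

# Exponential: a total proportional to (1 + RATE)**t means the
# head-start multiplier is (1 + RATE)**HEAD_START, whatever the horizon.
exponential_gain = (1 + RATE) ** HEAD_START

print(f"cubic growth:       x{cubic_gain:.9f}")        # ~1.0000003 (negligible)
print(f"exponential growth: x{exponential_gain:.2f}")  # ~2.70 (nearly triples the total)
```

Under cubic growth, a century’s head start adds on the order of one part in ten million to the final total; under 1% exponential growth it would nearly triple it. So the counterargument leans heavily on the exponential assumption.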

So I lean towards thinking that averting existential risks should be a much higher priority than creating a death-free, always-wonderful utopia. Or maybe not. Either way, the answer seems very important for questions like how we should focus our resources, and also whether you should push the button to turn on a machine that will allegedly create a utopia.