The Extinction Dilemma
Inspiration came from "Against the Linear Utility Hypothesis and the Leverage Penalty".
Place a value on the utility of a utopia with one human; call this X.
Place a value on the disutility of all of humanity going extinct; call this Z.
Decide whether your utility function is linear in the number of lives saved.
Place a value on the utility of all of humanity achieving a perfect utopia; call this K.
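To make the linearity question concrete (a sketch in my own notation, not taken from the original post): if your utility function is linear in lives, a utopia with N humans is worth N times the one-human utopia, so

$$
U(\text{utopia with } N \text{ humans}) = N \cdot X, \qquad K \approx N \cdot X \text{ for the } N \text{ humans who reach utopia},
$$

whereas a non-linear (for example, bounded) utility function lets K grow far more slowly than N.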
Omega offers you a bet. This bet has a 50% chance of humanity reaching a Kardashev type V civilisation (colonising the multiverse) and a perfect utopia, and a 50% chance of human extinction this instant.
Omega is an omnipotent entity that always tells the truth; you know and believe this, and so on.
Do you accept the bet?
What about 1:3 odds?
What about 3:1 odds?
What is the highest probability of extinction at which you would accept Omega’s offer?
What does this say about your utility function?
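For reference, a straightforward expected-utility calculation (a sketch, taking the status quo as utility zero and using the quantities defined above) says that accepting at extinction probability p requires

$$
(1 - p)\,K + p\,(-Z) \ge 0 \quad\Longleftrightarrow\quad p \le \frac{K}{K + Z},
$$

so your highest acceptable extinction probability is fixed by the ratio of K to Z. If K is linear and unbounded in the number of lives, that threshold can be pushed arbitrarily close to 1; if your answer has a hard ceiling, that suggests your utility function is not linear in lives.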
There is no finite number of lives reaching utopia for which I would accept Omega's bet at a 90% chance of extinction.
This does not mean I would accept Omega's bet at an 89% chance of extinction (I wouldn't), but 90% is far above my ceiling, and I'm very sure of it.
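One way to make this concrete (a minimal sketch with illustrative numbers of my own; the bounded utility function and all constants below are assumptions, not anything from the post):

```python
import math

# Illustrative numbers only -- assumptions, not values from the post.
X = 1.0        # utility of a utopia with one human
Z = 1e10       # disutility of extinction, in the same units
U_MAX = 1e9    # ceiling of a bounded utility function (assumption)

def bounded_utopia_utility(n_lives: float, scale: float = 1e9) -> float:
    """Bounded utility: approaches U_MAX as n_lives grows, never exceeds it."""
    return U_MAX * (1 - math.exp(-n_lives / scale))

def accepts_bet(p_extinction: float, utopia_utility: float) -> bool:
    """Expected-utility test: accept iff (1 - p) * K - p * Z >= 0."""
    return (1 - p_extinction) * utopia_utility - p_extinction * Z >= 0

# With a bounded utility function, no number of lives rescues the 90% bet,
# because 0.1 * U_MAX < 0.9 * Z regardless of how many lives reach utopia.
for n in (1e3, 1e9, 1e30, 1e100):
    print(f"bounded, n={n:.0e}: accept = {accepts_bet(0.9, bounded_utopia_utility(n))}")

# With linear (unbounded) utility K = n * X, some finite n always flips the answer.
n_needed = 0.9 * Z / (0.1 * X)  # smallest n with 0.1 * n * X >= 0.9 * Z
print(f"linear: accepts the 90% bet once n >= {n_needed:.0e}")
```

The bounded case mirrors my position above: there is simply no n that makes the 90% bet worthwhile, while the linear case is forced to accept once n is large enough.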