My answer to “If AI wipes out humanity and colonizes the universe itself, the future will go about as well as if humanity had survived (or better)” depends almost entirely on how the question is interpreted. It could swing pretty wildly between readings, but the obvious interpretation seems ~tautologically bad.
Agreed, I can imagine very different ways of getting a number for that, even given probability distributions for how good the future will be conditional on each of the two scenarios.
A stylized example: say that the AI-only future has a 99% chance of being mediocre and a 1% chance of being great, and the human future has a 60% chance of being mediocre and a 40% chance of being great. Does that give an answer of 1% or 60% or something else?
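One way to see how different readings give different numbers for that stylized example, assuming (purely for illustration) that the two hypothetical futures are independent, that outcomes are just “mediocre” or “great”, and that a tie counts as “about as well”:

```python
# Rough sketch of how different readings of the question give different numbers.
# Assumptions (not from the original exchange): the two futures are independent,
# outcomes are just "mediocre" or "great", and a tie counts as "about as well".

p_ai = {"mediocre": 0.99, "great": 0.01}     # AI-only future
p_human = {"mediocre": 0.60, "great": 0.40}  # humanity-survives future

# Reading A: just the chance the AI-only future turns out great.
reading_a = p_ai["great"]  # 0.01

# Reading B: the chance the human future is only mediocre, so the AI-only
# future can't fall short of it.
reading_b = p_human["mediocre"]  # 0.60

# Reading C: the chance the AI-only outcome is at least as good as the human
# outcome, comparing the two branches as independent draws.
rank = {"mediocre": 0, "great": 1}
reading_c = sum(
    pa * ph
    for ai_outcome, pa in p_ai.items()
    for human_outcome, ph in p_human.items()
    if rank[ai_outcome] >= rank[human_outcome]
)  # 0.99*0.60 + 0.01*0.60 + 0.01*0.40 = 0.604

print(reading_a, reading_b, reading_c)
```

So even with the same two distributions fixed, the number you report could plausibly be 1%, 60%, or about 60.4%, depending on which comparison you take the question to be asking for.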
I’m also not entirely clear on what scenario I should be imagining for the “humanity had survived (or better)” case.
I think that one is supposed to be parsed as “If AI wipes out humanity and colonizes the universe itself, the future will go about as well as, or go better than, if humanity had survived” rather than “If AI wipes out humanity and colonizes the universe itself, the future will go about as well as if humanity had survived or done better than survival”.