I cannot fathom the confusion that would lead to this question. Of course it’s better for humanity to survive than to not survive. Of course it’s better to go extinct in a million years than to go extinct now. The future is more wondrous and less scary than you imagine.
Of course it’s better for humanity to survive than to not survive
That only makes sense if you think life is always better than death. But that certainly isn’t my view—I think some possible futures are so bad that extinction would be preferable. In that case, the answer to the title question depends on the probabilities of such futures.
EDIT: For the record, I don’t think we need to resort to pulling the plug on ourselves anytime soon.
I don’t think life is always better than death according to my utility function.
I do, however, think that the most likely outcome, given the priorities of the blind idiot god or perhaps even self-described benevolent minds, is that in the very long term the inhabitants of such spaces are minds who are quite okay being there.
On “Benevolent” minds:
If I knew beyond a doubt that something I would consider hell exists, and that everyone goes there after being resurrected on judgment day, and I also knew it was very unlikely that I could stop everyone from ever being born or resurrected, I would opt for trying to change or create people who would enjoy living in that hell.
My question was not meant to be taken literally; it was instrumental, meant to highlight the possibility that attempts to maximize utility not only fail but actually minimize it, i.e. increase the amount of suffering. Instrumentally, isn't it better to believe that winning is impossible than that it's likely, if the actual probability is very low?
To decide to lose intentionally, I need to know how much it costs to try to win, what the odds of success are, and what the difference in utility is if I win.
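A back-of-the-envelope sketch of that comparison, with purely hypothetical numbers (none of these figures come from the thread): when the odds are low enough and trying is costly, the expected utility of trying can drop below that of conceding up front.

```python
# Hypothetical numbers for illustration only -- not claims about the actual odds.
p_win = 0.01          # odds of success
u_win = 100.0         # utility if the attempt succeeds
u_lose = -10.0        # utility of the bad outcome either way
cost_of_trying = 5.0  # what it costs to make the attempt

# Expected utility of trying versus losing intentionally.
eu_try = p_win * u_win + (1 - p_win) * u_lose - cost_of_trying
eu_concede = u_lose

print(f"EU(try)     = {eu_try:.2f}")      # -13.90 with these numbers
print(f"EU(concede) = {eu_concede:.2f}")  # -10.00
print("Try to win" if eu_try > eu_concede else "Lose intentionally")
```

With these made-up figures the attempt is not worth it, but a larger payoff or cheaper attempt flips the answer, which is exactly why all three quantities matter.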
I feel like people weigh those factors unconsciously and automatically (using bounded resources and rarely with perfect knowledge or accuracy).