You’ve presented this as a scenario in which you have to choose between two conflicting theories. But the problem you face isn’t “should I choose X or should I choose Y?”; it’s “given that I have this conflict, what should I do now?”. That problem is objective, it is different from the problem of whether X is right or Y is right, and it is solvable. Given that this is the year 2050 and humanity won’t in fact be wanting, the best solution may be to wait, pending further research to resolve the conflict. That isn’t an implicit bet against X and for Y; it is a solution to a different problem from the ones X and Y address.
For the sake of argument, say the plant requires a rare and unstable isotope to get started. Earth’s entire supply is contained in the plant and will decay in 24 hours.
I could also pose a similar dilemma, but this time there is only one theory, which says that whether the plant works or creates a black hole depends on a single quantum event with a 50% chance of going either way. What do you do? If you wouldn’t turn the plant on, I can ask the same question with only a 25% chance of a black hole, and so on until I learn the ratio of the utility values you assign to “post-scarcity future” and “extinction of humanity”. This might, for example, tell me that the chance of a black hole has to be less than 30% for you to press the button.
Then I ask you the original dilemma, and learn whether the probability you assign to theory X is above or below 70%. If I have far too much time on my hands I can keep modifying the dilemma with slightly altered pay-offs until I pinpoint your estimate.
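The elicitation procedure above can be sketched as a bisection: offer gambles with varying black-hole probability and home in on the indifference point, which pins down the utility ratio. A minimal sketch, where the specific utility numbers are illustrative assumptions chosen so the indifference point lands at the 30% from the example, not anything stated in the thread:

```python
# Hypothetical expected-utility agent: presses the button iff the gamble
# beats doing nothing (utility 0).  The utility values are illustrative
# assumptions; with u_good = 30 and u_bad = -70 the agent presses
# exactly when the black-hole chance q is below 30%.
def would_press(q, u_good=30.0, u_bad=-70.0):
    return (1 - q) * u_good + q * u_bad > 0

def indifference_point(agent, tol=1e-6):
    """Bisect on the black-hole probability q to find where the
    agent switches from pressing to not pressing."""
    lo, hi = 0.0, 1.0  # invariant: agent presses at lo, refuses at hi
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if agent(mid):
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

q_star = indifference_point(would_press)  # ~0.3 for the utilities above
```

Once the ratio is known, the original dilemma reveals the agent’s credence in theory X the same way: pressing the button there implies that credence exceeds the complementary threshold (70% in the running example).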
This avoids the question. If it helps, try to construct a version of it in the least convenient possible world. For example, one obvious move is to stipulate that something about theory X means the plant can only be turned on at a certain celestial conjunction, and would otherwise need to wait a thousand years. (Yes, this is silly, but it gets the point across; that’s why it is a least convenient world.)
One can vary the situation. For example, it might be that under theory X, medicine A will save a terminally ill cancer patient, while under theory Y, medicine B will save them, and both theories agree that A and B together will kill the patient.
I suppose you get that when the container containing the black dye explodes....
Damn, I made that mistake every single time I typed it and I thought I’d corrected them all.