See also a mathematical result of some relevance: “Convergence of Expected Utilities with Algorithmic Probability Distributions” by Peter de Blanc. He proves that in a certain setting, all expected utility calculations diverge.
There is also Nick Bostrom’s paper “Infinite Ethics”, analysing the problems of aggregative consequentialism when there is a finite probability of infinite value being at stake, and his “Astronomical Waste” where there are a mere 10^38 lives at stake, 10^29 of them going to waste every second we delay our conquest of the universe.
In 2007 Eliezer declared himself confused on the subject. I don’t know if he has since found an answer that satisfies him.
However, none of these address the problem of uncertainty, which appears in them only in the probabilities that go into the expected utility calculations. Uncertainty about the probabilities is just folded into the probabilities. In these settings there is no such thing as uncertainty that cannot be expressed as probability.
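To make the folding-in concrete, here is a minimal sketch (my own illustration, not drawn from any of the papers above): a Bayesian agent uncertain about a coin's bias still ends up with a single first-order probability for expected-utility purposes, because mixing over hypotheses about the probability is equivalent to using the mixture's mean.

```python
# Sketch: second-order uncertainty collapses into a first-order probability.
# Hypothetical setup: the agent thinks a coin's bias p is either 0.2 or 0.8,
# with credence 0.5 in each.
credences_over_p = {0.2: 0.5, 0.8: 0.5}

# The mixed (first-order) probability of heads is just the mean of p:
p_mixed = sum(p * w for p, w in credences_over_p.items())

def expected_utility(u_heads, u_tails):
    """Expected utility of a bet, averaging over the agent's hypotheses about p."""
    return sum(w * (p * u_heads + (1 - p) * u_tails)
               for p, w in credences_over_p.items())

# By linearity of expectation, this always equals the calculation that
# uses p_mixed alone — the uncertainty about p has vanished into p_mixed.
eu_full = expected_utility(10, -3)
eu_collapsed = p_mixed * 10 + (1 - p_mixed) * (-3)
assert abs(eu_full - eu_collapsed) < 1e-9
```

This is exactly why, in these frameworks, there is no separate role for "uncertainty about the probabilities": linearity of expectation erases the distinction before any decision is made.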