By this reasoning, we should treat the chance of AI killing half the world as 50%, the chance of AI killing a quarter of the world as 50%, the chance of either AI or a meteor killing the world as 50%, and so on.
And then you have to estimate the chances of electrons or video game characters being sentient. They're nonzero, right? Maybe electrons only have a 10^-20 chance of being sentient.
I think the probability that electrons are sentient is much higher than 10^-20. Nonetheless, that doesn’t convince me that electron well-being matters far more than anything else.
I don’t have an unbounded utility function where I chase extremely small probabilities of extremely big utilities (Pascal’s Mugging).
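To spell out the arithmetic behind that refusal (a minimal illustration; the specific numbers are chosen for the example, not taken from the discussion): with an unbounded utility function, a mugger can always name a payoff large enough to swamp any small probability,

\[
\mathbb{E}[U] = p \cdot U, \qquad \text{e.g. } p = 10^{-20},\; U = 10^{30} \;\Rightarrow\; \mathbb{E}[U] = 10^{10}.
\]

If utility is instead bounded by some \(U_{\max}\), the expected value can never exceed \(p \cdot U_{\max}\), so vanishingly small probabilities stop dominating the decision.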