I see it as necessary, because I don’t see anthropic probabilities as actually meaning anything.
Standard probabilities are informally “what do I expect to see”, and this can be formalised as a cost function for making the wrong predictions.
In anthropic situations, the “I” in that question is not clear—you, or you and your copies, or you and those similar to you? When you formalise this as a cost function, you have to decide how to spread the cost amongst your different copies—do you spread it as a total cost, or an average one? In the first case, SIA emerges; in the second, SSA.
So you can’t talk about anthropic “probabilities” without including how much you care about the cost to your copies.
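The total-vs-average split can be illustrated with a toy calculation (my own sketch, not from the comment above): Sleeping Beauty with a Brier score as the cost for wrong predictions, where Heads produces one awakening and Tails produces two. Summing the cost over all copies makes the optimal credence in Heads the SIA (“thirder”) answer; averaging the cost within each world makes it the SSA (“halfer”) answer.

```python
# Sleeping Beauty: fair coin, Heads -> 1 awakening, Tails -> 2 awakenings.
# p is your credence in Heads; each awakening incurs Brier cost (p - truth)^2.

def total_cost(p):
    # Sum the cost over every copy/awakening (SIA-style accounting).
    heads_world = 1 * (p - 1) ** 2  # one awakening where Heads is true
    tails_world = 2 * (p - 0) ** 2  # two awakenings where Heads is false
    return 0.5 * heads_world + 0.5 * tails_world

def average_cost(p):
    # Average the cost over the copies within each world (SSA-style accounting).
    heads_world = (p - 1) ** 2      # average over the single Heads awakening
    tails_world = (p - 0) ** 2      # average over the two identical Tails awakenings
    return 0.5 * heads_world + 0.5 * tails_world

# Grid search for the credence that minimises each cost function.
grid = [i / 1000 for i in range(1001)]
p_total = min(grid, key=total_cost)      # close to 1/3 (SIA / thirder)
p_average = min(grid, key=average_cost)  # exactly 1/2 (SSA / halfer)
print(p_total, p_average)
```

The only thing that changed between the two functions is how the cost is spread over copies, which is the point: the “probability” you end up endorsing is downstream of that bookkeeping decision.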
“So you can’t talk about anthropic ‘probabilities’ without including how much you care about the cost to your copies”—Yeah, but that has nothing to do with morality, just individual preferences. And instead of using just a probability, you can specify both a probability and the number of repeats.