There is no situation involving 3^^^3 people which will ever have a high probability.
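(For scale: the ^^^ here is Knuth's up-arrow notation. Below is a minimal sketch of the recursion; the function name knuth_up and the small test values are just illustrative choices, and 3^^^3 itself is far too large to ever actually compute.)

    # Knuth up-arrow notation: a ↑^k b, written here as knuth_up(a, k, b).
    # One arrow is exponentiation; each extra arrow iterates the previous operation.
    def knuth_up(a, k, b):
        if k == 1:
            return a ** b
        if b == 0:
            return 1
        return knuth_up(a, k - 1, knuth_up(a, k, b - 1))

    assert knuth_up(3, 1, 3) == 27             # 3^3
    assert knuth_up(3, 2, 3) == 7625597484987  # 3^^3 = 3^(3^3)
    # 3^^^3 = 3^^(3^^3) is a power tower of 3s roughly 7.6 trillion levels high,
    # far beyond anything physically representable.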
Really? No situation? Not even if we discover new laws of physics that allow us to have infinite computing power?
Telling me I need to adopt a utility function which handles such situations well amounts to trying to mug me, because such situations will never come up.
We are talking about utility functions. Probability is irrelevant. All that matters for the utility function is that if the situation came up, you would care about it.
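(In standard expected-utility terms, and this framing is mine rather than the speaker's, probability and utility enter the decision as separate factors, so the utility function only needs to be defined over outcomes, however improbable they are:

    E[U(A)] = \sum_i p_i \, U(o_i)

where p_i is the probability of outcome o_i given action A. U can assign a value to an outcome involving 3^^^3 people even if p_i is astronomically small.)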
Also, I don’t care about the difference between 3^^^^^3 people and 3^^^^^^3 people even if the probability is 100%, and neither does anyone else.
I totally disagree with you. Sure, these numbers are so incomprehensibly huge that you can’t picture them in your head; there is massive scope insensitivity. But if you had to make moral choices affecting groups of those two sizes, you should still value the bigger number proportionally more.
E.g. if you had to torture 3^^^^^3 people to save 3^^^^^^3 people from getting dust specks in their eyes, or to make bets involving the probabilities of various things happening to the different groups, etc. I don’t think you can make these decisions correctly with a bounded utility function.
If you don’t make them correctly, well, a group of 3^^^3 people probably contains a basically infinite number of copies of you. By making the correct tradeoffs, you maximize the probability that the other versions of yourself find themselves in a universe with higher utility.
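(A minimal sketch of the bounded-utility worry, assuming an illustrative bounded form U(n) = 1 - exp(-n/C); since 3^^^^^3 and 3^^^^^^3 cannot be represented at all, smaller stand-in numbers are used just to show the saturation effect:

    import math

    def bounded_utility(n, cap_scale=1e6):
        # Bounded form: approaches 1 as n grows and never exceeds it.
        # (Illustrative choice of functional form, not anyone's actual proposal.)
        return 1 - math.exp(-n / cap_scale)

    def linear_utility(n):
        # Unbounded: values n people proportionally.
        return n

    # Stand-ins for "huge" and "vastly huger"; the real numbers are not representable.
    huge, vastly_huger = 10**8, 10**14

    print(bounded_utility(huge), bounded_utility(vastly_huger))
    # Prints 1.0 1.0 to double precision: the bounded function has already
    # saturated, so it is indifferent between the two groups.
    print(linear_utility(vastly_huger) / linear_utility(huge))
    # Prints 1000000.0: the linear function values the larger group proportionally more.

Once the bounded function has saturated, no trade between the two groups registers at all, which is the sense in which a bounded utility function cannot make these tradeoffs at the claimed proportional rate.)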