Eliezer, also consider this: suppose I am a mad scientist deciding between making one copy of Eliezer and torturing it for 50 years, or making 1000 copies of Eliezer and torturing them all for 50 years.
The second possibility is much, much worse for you personally. In the first case there would be two instances of you, the original and the copy, and only one is tortured, so you would subjectively have a 50% chance of being tortured. In the second case, 1000 of the 1001 instances are tortured, giving you a subjective chance of roughly 99.9% of being tortured. So the second possibility is much worse, which means that creating copies of bad experiences multiplies the badness, even without any diversity among the copies. But by the same reasoning, copies of good experiences should also multiply: if I make a million copies of Eliezer each experiencing billions of units of utility, that would be much better than making only one copy, which would give you only a 50% subjective chance of having the experience.
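Spelling out the arithmetic (a minimal sketch, assuming the usual self-sampling rule that you are equally likely to find yourself as any one of the resulting instances, original included):

P(tortured) = (tortured instances) / (total instances)

One copy:    1 / (1 + 1)       = 50%
1000 copies: 1000 / (1000 + 1) ≈ 99.9%

The same formula, with "tortured" swapped for "blissful", is what drives the symmetric conclusion about good experiences.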