I should clarify that resource-compatibility is a claim about the mundane and exotic values humans actually hold. It’s a contingent claim, not a necessary one. Yes, some people think the natural world is a hell-pit of suffering (negative utilitarians like Brian Tomasik), but they’re typically scope-sensitive and longtermist, so they’d care far more about the distal resources.
You could construct a value profile like “utility = −1 if suffering exists on Earth, else 0”, which would be exotic values seeking proximal resources. I don’t have a good answer for handling such cases. But empirically, this value profile seems rare.
I agree that this is an empirical claim; in fact, it’s pretty much the major interesting question here! Please say more about why you think this is the case. Empirically, Brian Tomasik does exist, as do nature-has-intrinsic-value hippies, so somebody is definitely getting shafted in any given future.
My intuition is that in the limit of infinite power, human moral intuitions come apart quite a lot more than you’ve addressed here. For example, I think the utopia described here throws away >90% of the value in the universe for no reason, and maybe more, depending on how much of the universe gets converted to -oniums.
I also think that any attempt to consider some kind of moral value compromise should probably address what kind of process we would actually expect to produce a nice compromise like this. For example, it seems very unlikely to me that a moral value war would lead to a good ending like this.