Personally, I would modus tollens this and take it as an example of why it’s absurd to morally value things in other universes or outside my light cone.
Do you bite the bullet that this means the set of things you morally value changes discontinuously and predictably as things move out of your light cone? (Or is there some way you value things less as they are less “in” your light cone, in some nonbinary way?)
It seems natural to weight things according to how much I expect to be able to interact with them.
Obviously that means my weightings can change if I unexpectedly gain or lose the ability to interact with things, but I can’t immediately think of any major problems with that.
Yeah, that is one of the hypotheses I'm attracted to, though it still feels wrong for a variety of other reasons.
Yeah, that is also roughly my response. However, the other thought experiment I suggested in another comment feels very similar to me, and that reduction doesn't really work in that case (though similar reductions kind of do). Interested in your response to that question.