Do you bite the bullet that this means the set of things you morally value changes discontinuously and predictably as things move out of your light cone? (Or is there some way you value things less as they are less “in” your light cone, in some nonbinary way?)
It seems natural to weight things according to how much I expect to be able to interact with them.
Obviously that means my weightings can change if I unexpectedly gain or lose the ability to interact with things, but I can’t immediately think of any major problems with that.
Yeah, that's one of the hypotheses I'm attracted to, though it feels wrong for a variety of other reasons.