Hmm, it seems that from your perspective "do non-consensual uploads (which humans would probably later be fine with) count as death" is actually a crux for fatality questions. This feels like a surprising place to end up, because I think keeping humans physically alive isn't much more expensive, and I expect much of the effort to keep humans alive to be motivated by fulfilling their preferences (in a non-bastardized form) rather than by something else.
Intuitively, I feel tempted to call it not death if people would be fine with it on reflection, but it seems like a mess and either way not that important.
The only people you're losing bargaining power with are the few aliens who strongly prefer "unmodified solar system continued" over "reconstructing the original unmodified solar system after the fact."
What about people who want you not to do things to the humans that they consider as bad as death (at least without further reflection)?
> Intuitively, I feel tempted to call it not death if people would be fine with it on reflection, but it seems like a mess and either way not that important.
Nod. I think this is fine, and also that resolving it the other way would be fine.
> "do non-consensual uploads (which humans would probably later be fine with) count as death" is actually a crux for fatality questions.
On my end the crux is more like "the space of things aliens could care about is so vast, it just seems so unlikely for it to line up exactly with the preferences of currently living humans." (I agree "respect boundaries" is a Schelling value that probably has disproportionate weight, but there are still a lot of degrees of freedom in how to implement that, how to trade for it, and whether acausal economies have a lot of Very Oddly Specific Trades (e.g. saving a very specific group) going on that would cover it.)
The question of whether nonconsensual uploads that you'd maybe endorse later count as death is one I end up focused on mostly because you're rejecting the previous paragraph.
> What about people who want you not to do things to the humans that they consider as bad as death (at least without further reflection)?
I agree that's a thing; it's just that there are lots of other things aliens could want.
(Not sure if cruxy, but I think the aliens will care about respecting our agency more like the way we care about respecting trees' agency than the way we care about respecting dogs' agency.)
Or: "we will be more like trees than like dogs to them." It seems quite plausible they might be more wisely benevolent towards us than humans currently are towards trees.
But it seems like an important intuition pump for how they'd be engaging with us and what sort of moral reflection they'd have to be doing.
I.e., on the "bacteria → trees → cats → humans → weakly superhuman LLM → … ??? … → Jupiter Brain that does acausal trades" spectrum of coherent agency and intelligence, it's not obvious whether we're more like Jupiter Brains or like trees.
(Somewhere there's a nice Alex Flint post about how you would try to help a tree if you were vaguely aligned to it.)