I see. I’m not sure that solves the problem for the ems, since I think the biologicals may already have such an instability by themselves, which the ems would then copy, but it certainly slows things down. And there is now an extra step where the ems look at something the biologicals chose to change about themselves and presumably have the option to say “we don’t approve, we’re not going to adopt that, and in fact we’re going to influence the biologicals to undo it, because (given that we’re not going to adopt it) it makes them less useful to us”, so it might actually slow the process even for the biologicals. That doesn’t by itself prove that the process converges to a stable state; it might just mean it diverges more slowly. However, if the ems WANT the biologicals’ process to converge to a stable state rather than diverging, they can almost certainly arrange that it does, since fundamentally they have more power.
However, I think the whole ems situation has a different instability, which I discuss in Uploading, so I see the whole situation as already unstable, just with a different failure mode. Very briefly: ems are easy to upgrade, and baseline human moral intuitions and ethical behaviors are not well calibrated for a situation in which some people have orders of magnitude more capability than others. Humans are not actually aligned; they’re merely good at allying among approximate equals, and once you start adding large capability differences between humans in a society, things go badly. So if ems upgrade, you either need to keep their capabilities similar, or change their ethics and behavior enough that this isn’t a problem any more, or put a lot of social controls in place to prevent problems. So basically, ems/uploads have a problem comparable to the AI alignment problem, which would similarly need to be solved before this even became a potential problem.