1. True, this depends on the nature of personal identity. However, if some finite form of identity holds, I should not worry about "hostile resurrection": that a future AI will steal information about me, create my copy, and torture it. This closes off many bad outcomes.
2. This is more likely to work for an "instrumental suffering maximizer", which may use human suffering for blackmail. For an AI whose final goal is maximizing suffering, there could be some compromise: it is allowed to torture one person for one second. And since I suggested this idea, I have to volunteer to be that person.