You’re an equivalence class. You don’t save the last eight billion humans, you save eight billion humans in each of the infinitely many worlds in which your decision algorithm is instantiated.
Why is that significant? No matter how many worlds I’m saving eight billion humans in, there are still humans left over who are saved no matter what I do or don’t do. So the “reward” of my actions still gets downgraded from “preventing human extinction” to “saving a bunch of people, but humanity will be safe no matter what”.
In fact...hmm...any given human will be instantiated in infinitely many worlds, so you don’t actually save any lives. You just increase those lives’ measure, which is sort of hard to get excited about.
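One way to make the "measure, not lives" point concrete (a rough sketch under assumed notation, not anything from the original exchange): write the value of an action $a$ as a measure-weighted sum over the worlds containing some given person $p$,

$$
V(a) \;=\; \sum_{w \,\ni\, p} \mu(w \mid a)\, U(w),
$$

where $\mu(w \mid a)$ is the measure of world $w$ conditional on your decision algorithm (and hence its whole equivalence class) outputting $a$, and $U(w)$ is how well $p$ fares in $w$. On this picture, choosing $a$ never changes whether $p$ exists somewhere; it only shifts $\mu$ toward worlds in which $p$ survives, which is exactly the "you just increase those lives' measure" complaint.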