I think you have it completely backwards: SIA isn’t based on egotism, but precisely the reverse. You’re more likely, as a generic observer, to exist in a world with more generic observers, because you AREN’T special, and in the sense of being just a twinkle of a generic possible person, you could be said to be equally all 99 people in a 99-person world.
You are more likely to be in a world with more people because it’s a world with more of YOU.
Here’s the problem. YOU’RE the egoist, in the sense that you’re only tallying the score of one random observer out of 99, as though the other 98 don’t matter. We have a possible world where one person is right or wrong, and a possible world where 99 people are right or wrong, but for some reason you only care about 1 of those 99 people.
EDIT: more talking
Under anthropic reasoning, if we flip a coin and create 5 observers if it’s heads, or 95 observers if it’s tails, and if all you know is that you are an observer created after the coin flip, then the way you guess which of the 100 possible observers you are is to pick randomly among them, giving you a 5% chance of being a heads observer and a 95% chance of being a tails observer.
Under nonanthropic reasoning, it’s a little more complicated. We have to stretch the probabilities of being the 5 heads-world observers so that they take up as much probability space as the 95 tails-world observers. Because, so the thinking goes, your likelihood of being in a possible world doesn’t depend on the number of observers in that world. Unless the number is zero; then it does. Please note that this special procedure is performed ONLY when dealing with situations involving possible worlds, and not when both worlds (or hotels, or whatever) actually exist. This means that nonanthropic reasoning depends on the many-worlds interpretation of quantum mechanics being false, or at least, if it’s true, coin flips go back to being covered by anthropic reasoning and we have to switch to situations that are contingent on some digit of pi or something.
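The arithmetic behind the two rules can be sketched in a few lines. This is just my own illustration of the coin-flip setup above (the function names are mine; the nonanthropic rule here is the usual "equal weight per world" reading, which may not match every formulation):

```python
def anthropic_probability(n_heads, n_tails):
    """Anthropic rule: pick uniformly among ALL possible observers,
    so a world's weight scales with its observer count."""
    total = n_heads + n_tails
    return n_heads / total, n_tails / total

def nonanthropic_probability(n_heads, n_tails):
    """Nonanthropic rule: give each possible WORLD equal weight first,
    regardless of how many observers it contains (as long as that
    number isn't zero), then split the weight among its observers."""
    # The chance of being *some* heads observer is the heads world's
    # weight, 0.5, no matter whether it holds 5 observers or 5 million.
    return 0.5, 0.5

# 5 observers on heads, 95 on tails:
print(anthropic_probability(5, 95))     # (0.05, 0.95)
print(nonanthropic_probability(5, 95))  # (0.5, 0.5)
```

The "stretching" described above is visible in the second function: each of the 5 heads-world observers absorbs 0.1 of probability space, while each of the 95 tails-world observers gets about 0.0053.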
This smells a little fishy to me. It seems like there’s a spanner in the works somewhere, ultimately based on a philosophical objection to the idea of a counterfactual observer. The result is a well-hidden but ultimately mistaken kludge in which certain data (the number of observers) is thrown out under special circumstances (the number isn’t zero, and the observers exist only contingent on some immutable aspect of the universe whose nature we don’t know, such as a particular digit of pi).