But perhaps the bigger reason is that I find SIA (the Self-Indication Assumption) intuitively obvious. It’s just what you get when you apply Bayesian reasoning to the fact that you exist.
Correct, except for the fact that you’re failing to consider the possibility that you might not exist at all...
My entire uncertainty in anthropic reasoning is bound up in the degree to which an “observer” is at all a coherent concept.
Actually, it’s the other way around. SIA always assumes that one may not have existed at all. This is the source of the Bayesian update and this itself may be a problem.
Basically, it requires assuming that all existing humans were randomly sampled from a finite set of immaterial souls—a pretty extraordinary claim about the way our universe works, without any evidence to support it.
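The SIA update being debated above can be sketched numerically. The two hypotheses, their priors, and the observer counts below are all invented for illustration; the point is only that SIA weights each hypothesis by how many observers it predicts, which is what drives the update toward "big" worlds:

```python
# Illustrative sketch of the SIA update: P(H | I exist) is proportional
# to P(H) * N_H, where N_H is the number of observers under hypothesis H.
# Hypotheses and counts are hypothetical, chosen only to show the mechanics.
priors = {"small world": 0.5, "big world": 0.5}
observers = {"small world": 1, "big world": 1000}

# Unnormalized SIA weights: prior times observer count.
unnorm = {h: priors[h] * observers[h] for h in priors}
total = sum(unnorm.values())
posterior = {h: w / total for h, w in unnorm.items()}

print(posterior)  # the big world ends up with ~99.9% of the posterior
```

Even with equal priors, the world with a thousand observers swamps the one with a single observer, which is exactly the kind of update the "immaterial souls" objection above is targeting.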