Suppose a doomsday scenario (whichever one you prefer) comes to pass, and wipes out 99.999999975% of humanity. The last two living humans cower in a bunker and discuss.
“If we imagine ourselves assigned randomly among all of the humans who ever lived, the odds are extremely low that we would by chance happen to be the last two; therefore we must expect another hundred billion or so humans to come after us, to make our place in line unremarkable. Statistically speaking, that can’t be the final apocalypse outside.”
One of them, comforted to learn that humanity has a long bright future still ahead of it, leaves the bunker and is immediately dissolved by the ever-encroaching tide of grey goo. The other is quietly amazed by their odds of being not just among the last two but the actual last. But not for very long.
(I don’t think the above argument is actually valid; I just mean to illustrate that anthropic reasoning seems very difficult to do correctly.)
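For the curious, here is a minimal sketch of the arithmetic the bunker speaker is leaning on, under the self-sampling assumption that your birth rank is uniformly random among all humans who will ever exist. The roughly 100 billion figure for humans born to date is an outside estimate I’m supplying for illustration, not something from the story.

```python
# A sketch of the bunker speaker's reasoning under the self-sampling
# assumption: treat your birth rank as uniform over all humans who
# will ever exist. N_PAST is a rough outside estimate (an assumption
# for illustration), not a figure from the story.

N_PAST = 100_000_000_000  # approx. humans born to date (assumption)

def p_in_last_k(n_total: float, k: int = 2) -> float:
    """P(a uniformly random birth rank lands in the final k of n_total)."""
    return k / n_total

# Hypothesis A: doom now, nobody comes after us.
# Being one of the final two is a ~2-in-100-billion coincidence:
print(f"P(last two | doom now)    = {p_in_last_k(N_PAST):.2e}")

# Hypothesis B: another hundred billion humans follow.
# The same birth rank then sits near the 50th percentile, which is
# entirely unremarkable:
n_total_b = N_PAST + 100_000_000_000
print(f"percentile of our rank    = {N_PAST / n_total_b:.0%}")
print(f"P(last two | long future) = {p_in_last_k(n_total_b):.2e}")
```

Both outputs are just restatements of the uniform-rank assumption; the sketch shows what the speaker is computing, not that the inference itself is sound.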