“The two scenarios involving Omega are only meant to establish that a late great filter should not be considered worse news than an early great filter.”
I honestly think this would have been way, way, way clearer if you had dropped the Omega decision theory stuff and just pointed out that, given great filters of equal probability, choosing an early great filter over a late great filter would entail wiping out the history of humanity in addition to the galactic civilization we could build — which most of us would definitely see as worse.
Point taken, but I neglected to mention that the Omega scenarios are also meant to explain why we might feel that a late great filter is worse news than an early one: an actual human, faced with the decision in scenario 2, might be tempted to choose the early filter.
I’ll try to revise the post to make all this clearer. Thanks.