E.g. if you have a broad distribution over possible worlds, some of which are “fragile” and have 100 things that cut value down by 10%, and some of which are “robust” and don’t, then you get 10,000x more value from the robust worlds. So unless you are a priori pretty confident that you are in a fragile world (or the fragile worlds are 10,000x more valuable, or whatever), the robust worlds will tend to dominate.
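As a quick back-of-the-envelope check on that ratio (a minimal sketch; the 100-cuts-of-10% setup is just the illustrative one above):

```python
# Value left in a "fragile" world after 100 independent 10% cuts,
# versus a "robust" world that keeps its full value.
fragile_value = 0.9 ** 100   # ~2.7e-5
robust_value = 1.0

print(fragile_value)                  # ~2.7e-5
print(robust_value / fragile_value)   # ~3.8e4, i.e. the same ballpark as the "10,000x" above
```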
This is only true if you assume that there is an equal number of robust and fragile worlds out there, and your uncertainty is strictly random, i.e. you’re uncertain about which of those worlds you live in.
I’m not super confident that our world is fragile, but I suspect that most worlds look the same as each other: maybe 99.99% of worlds are robust, or maybe 99.99% are fragile. If it’s the latter, then I probably live in a fragile world.
If there’s a 50% chance that 99.99% of worlds are robust and a 50% chance that 99.99% are fragile, then the vast majority of the EV comes from the first option, where the vast majority of worlds are robust.
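A minimal sketch of that calculation, assuming (purely for illustration) a robust world is worth 1 and a fragile world 1e-4, i.e. the rough 10,000x factor from above:

```python
# EV split for the 50/50 mixture over the two hypotheses described above.
p_robust_given_A = 0.9999   # hypothesis A: 99.99% of worlds are robust
p_robust_given_B = 0.0001   # hypothesis B: 99.99% of worlds are fragile
v_robust, v_fragile = 1.0, 1e-4   # illustrative per-world values

ev_A = p_robust_given_A * v_robust + (1 - p_robust_given_A) * v_fragile
ev_B = p_robust_given_B * v_robust + (1 - p_robust_given_B) * v_fragile

total = 0.5 * ev_A + 0.5 * ev_B
print(ev_A, ev_B)           # ~0.9999 vs ~0.0002
print(0.5 * ev_A / total)   # ~0.9998: almost all the EV comes from hypothesis A
```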
You’re right, the nature of uncertainty doesn’t actually matter for the EV. My bad.
I think it does actually, although I’m not sure exactly how. See “Logical vs physical risk aversion”.