I’m saying “Black Swan” to compress the following message: We cannot assign a probability at all because we have no statistics. Nevertheless, the stakes are so high that we should err on the side of caution. We need the book “just in case”. It’s a very specific, actionable step in existential-threat mitigation. Unlike other measures, it requires no new discoveries, only a modest investment of money and time.
You have to assign probabilities anyway. See the amended article:
Considering some event a black swan doesn’t give you leave to avoid assigning probabilities: making decisions based on the plausibility of such an event is still equivalent to assigning whatever probabilities would make the expected-utility calculation yield those decisions.
Okay, okay! How much is our civilization worth? Say, 10^20 dollars. If I had the money, I would be willing to part with 10^6 dollars to develop, manufacture, and distribute the book. Therefore, the probability of the book serving its primary purpose is 10^(-14).
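A minimal sketch of the arithmetic this implies, under a bare expected-value model (the variable names and the break-even framing are mine, not the commenter’s):

```python
# Under a simple expected-value model, buying the book at cost C is
# worthwhile only if p * V - C >= 0, i.e. p >= C / V. So a stated
# willingness to pay implicitly commits you to a probability.
V = 10**20  # assumed dollar value of civilization (figure from the comment)
C = 10**6   # stated willingness to pay for the book

implied_p = C / V  # smallest probability at which buying breaks even
print(f"break-even probability: {implied_p:.0e}")  # -> 1e-14
```

Strictly, this gives a lower bound on the probability the buyer is implicitly assigning, not a point estimate of the book’s actual usefulness.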
How much is our civilization worth? Say, 10^20 dollars.
That’s meaningless. You can’t assign a value in dollars to the continued existence of our civilization. Dollars are only useful for pricing things inside that civilization. (Some people argue for using utilons to price the civilization’s existence.)
If I had the money, I would be willing to part with 10^6 dollars to develop, manufacture, and distribute the book. Therefore, the probability of the book serving its primary purpose is 10^(-14).
The amount you’re willing to pay is a fact about you, not about the book’s usefulness. You’re saying you estimate its probability of usefulness at 10^-14. But why?
Black Swan.
Just saying “black swan” isn’t enough to give higher probability. If you think I can’t assign any meaningful probability at all to this scenario, why?
I don’t believe anyone can assign meaningful very small or very large probabilities in most situations. It is one of my long-running disagreements with people here and on OB.
There are indeed many known human biases of this kind, plus a general human inability to distinguish small differences in probability.
But we can’t treat every low-probability scenario as having, e.g., p = 0.1 or some other constant probability! What do you suggest, then?
I don’t know of a unified way of handling extremely small risks, but two things can help. First, as Marc Stiegler suggests in “David’s Sling”, simply recognize explicitly that such events are possible; that way, if one does occur, you can get on with dealing with the problem without also having to fight disbelief that it could have happened at all. Second, different people have different perspectives and interests and will treat different low-probability events differently; this dispersion of views and preparations helps ensure that someone is at least somewhat prepared. As I said, neither of these is really enough, but I simply can’t see any better options.
Clearly the market for civilization creation books is efficient.
Nice point. Maybe we should instead talk about scenarios where humanity (including us) no longer suffers aging but a collapse still occurs.
Incidentally, I wonder what the market price for writing a civilization-destroying book might be?
I believe the going rate is 45 virgins in the afterlife.