This suggests an interesting strategy for altruistic futurists: Instead of trying to predict how trends are going to go and attempting to influence them for the better, just accumulate money in a donor-advised fund (or probably a more flexible investment vehicle, really) and take it out in the event of a true emergency. (“Civilization’s rainy-day money.”)
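As a back-of-envelope illustration of how such a fund might accumulate (all figures here are assumptions for the sake of example, not recommendations):

```python
# Hypothetical sketch: future value of a pooled "rainy-day" fund built
# from annual contributions compounding at an assumed fixed real return.

def fund_value(annual_contribution: float, real_return: float, years: int) -> float:
    """Future value of a stream of year-end contributions."""
    total = 0.0
    for _ in range(years):
        total = total * (1 + real_return) + annual_contribution
    return total

# e.g. 100 donors each giving $1,000/year at an assumed 3% real return,
# held for 30 years before any emergency draws it down:
print(round(fund_value(100 * 1_000, 0.03, 30)))
```

Even modest pooled contributions compound into a meaningful reserve over decades, which is the main appeal of the "save now, spend in a crisis" strategy.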
Some thoughts on this idea: It might be a good idea to establish a mailing list to coordinate with others doing the same thing. For certain unpleasant scenarios, the government can be counted on to intervene. But it’s not obvious that its intervention will be quick or effective. Given the government’s existence, however, our comparative advantage might be in intervening before things reach a full-scale crisis.
One existing group along these lines is the Lifeboat Foundation, which claims to be attempting to establish a backup system for mankind, but it’s not clear to me to what extent they actually do anything.
I do think the “Civilization’s rainy-day money” idea is a good one in principle but I fear it would be expended on tragic but non-existential threats (like the latest big earthquake/hurricane/tsunami) rather than saved for existential risks.
Further, in the event that an existential risk did become apparent, I am not sure that having “rainy-day money” would really enhance our response, simply because we might not have enough time to spend the money on useful projects.
I’m suggesting this as something for LW users to do.
Yeah, some degree of forecasting is probably a good idea.
I know, but I don’t have much confidence that this would gather enough funds to be meaningful if it were LW users only; to scale up to the level where it could actually influence these risks substantially, I think it would have to draw in money (and hence influence) from outsiders.