I like your succinct way of restating the case for spending some money on catastrophes other than AI. It is possible that a loss of industry could be beneficial in the long term, and one can adjust the moral hazard parameter to account for this possibility. However, losing industry would expose us to more natural risks, such as supervolcanic eruptions and asteroid or comet impacts. And if we actually lost anthropological civilization, we would not be doing any AI safety work at all. Even just losing industry for a long time would, I think, make most AI safety work infeasible, though I am interested in your thoughts. Without industry, we could not afford nearly as many researchers, and those we had would be limited to doing math on paper.