Agreed. I think a big part of the reason we saw a large fiscal response to Covid but not to, e.g., the 2008 crisis was that Covid was widely agreed to be “nobody’s fault.”
In this sense, the faster AI produces unemployment, the more likely we are to see a policy response. If tens of millions of middle-class, educated workers suddenly wake up one day without a job, politicians will respond. If, on the other hand, AI slowly squeezes the lowest-productivity workers out of their jobs over the course of one to two decades, there will be calls for “reeducation” or “tough love” or some such nonsense as the economy slowly spirals downward, Japan-style.
Ironically, this makes for one of the few cases where “going faster” makes the transition to AGI less harmful, whereas most AI safety issues get worse the faster the transition is.