Thanks for pointing out the error. I don’t think anyone is actually plotting nuclear strikes on AI labs. However, Putin or Kim could come to believe that such a strike is their only chance to preserve power in the AI age. In any case, it couldn’t be regarded as a success: it would be either a global catastrophe or a short delay.
But I will think about how to add an estimate for a successful AI ban.
A bit gloomy, if the only outcomes are global catastrophe or delayed catastrophe.
I thought we had 40 years, but with Elon Musk talking about 7-8 years until AGI, and with the recent four-hour training run that reached [world chess supremacy][chess], I am not so sure. So I think we need to buy some time. Even if you can’t destroy the semiconductor fabs, you could still increase taxes. This could be marketed as helping to pay for society’s dislocation while we undergo job losses.
I also think there are only a few years until dangerous AI.
See my presentation about it:
https://www.academia.edu/34863051/Near-term
However, I think that a war would only increase extinction risks, and even AI risks, as it would accelerate the arms race and shut down ethical thinking. Also, a strike on Silicon Valley would kill the best minds in AI safety, while some obscure Chinese labs would continue to exist.
Thank you for that reference. I hadn’t seen a quantification of the Bitcoin network’s compute capacity before; it was interesting, and the number is high.
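For a rough sense of what such a quantification looks like, here is a back-of-envelope sketch. Both numbers below are illustrative assumptions for the exercise, not figures taken from the reference:

```python
# Back-of-envelope estimate of the Bitcoin network's raw compute.
# Both inputs are illustrative round numbers, not measured data.

hashrate_hs = 1e19   # assumed network hashrate, hashes/second (~10 EH/s)
ops_per_hash = 1e4   # rough integer operations per double-SHA-256 hash

# Equivalent integer operations per second across the whole network
network_ops_per_s = hashrate_hs * ops_per_hash

print(f"~{network_ops_per_s:.0e} integer ops/s")
```

One caveat worth keeping in mind: mining ASICs can only compute SHA-256, so this capacity could not be redirected to AI training; the comparison only gives a sense of scale.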