We have a chemical weapons ban because chemical weapons are obsolete. We have a Biological Weapons Convention too, but I think that’s because today’s biological weapons resemble chemical weapons, while future biological weapons will favor non-state actors. Russia, China and the USA haven’t even joined the Ottawa Treaty, which bans landmines. We don’t have a “multilateral, international governmental/legal agreement to halt” the production of nuclear weapons, cyberweapons or autonomous weapons. Nuclear weapons are easier to ban than AI because nuclear weapons require uranium, centrifuges and missiles, whereas AI requires an AWS account. The risks of nuclear weapons are well-understood. The risks of AI are not.
If the short-timeline theories are correct, then the only thing that could slow the development of AI is breaking civilization so severely that technological progress grinds to a halt. If the short-timeline theories are correct, then even nuclear war could delay the advent of AI by mere decades.
Not just any nuclear war would work. A nuclear exchange between major superpowers in which many well-developed countries survive would not slow AI significantly. Such a war might even accelerate AI development, since AI would come to be seen as a new protective weapon.
Only an anti-AI nuclear war that targeted chip fabs, data centers, electricity sources and think tanks, plus the global economy in general, might be “effective” in halting AI development. I don’t endorse it, as the probability of extinction from such a war is close to the probability of extinction from AI.