And, from our understanding of your political goals, “treaty” is probably more like the thing you actually want. But either is kinda reasonable to say in this context.
Thinking about the advisor’s comments:

What I understand Yudkowsky and Soares want might be summarized as something like:
“Large numbers of GPUs should be treated like large numbers of uranium gas centrifuges.”
“Publishing details of certain AI algorithms should be treated like publishing detailed engineering guidelines for improving the yield of a nuclear device.”
“Researching certain kinds of AI algorithms should be treated like doing gain-of-function research into highly contagious airborne Ebola.” (Actually, we probably don’t take bio threats nearly as seriously as we should.)
The thing they want here includes a well-written treaty. Or an “international agreement.” If you buy their assumptions, then yes, you would want to lay out bright lines around things like data center capacity and monitoring, chip fabs, and possibly what kind of AI research is publishable.
But nuclear deterrence also has quite a few other moving parts beyond the international agreements, including:
The gut knowledge of the superpowers that if they screw this up, then their entire civilization dies.[1]
A tense, paranoid standoff between the key players.[2]
A system of economic sanctions strongly backed by major powers.
Mysterious bad things happening to uranium centrifuges.
Quiet conversations between government officials and smart people where the officials say things like, “Pretty please never mention that idea again.”[3]
The key point in all of these circumstances is that powerful people and countries believe that “If we get this wrong, we might die.” This isn’t a case of “We want 80% fewer of our cities to be blown up by fusion bombs.” It’s a case of “We want absolutely none of our cities blown up by fusion bombs, because it won’t stop with just one or two.”
And so the rule that “You may only own up to X uranium gas centrifuges” is enforced using multiple tools, ranging from treaties/agreements to quiet requests to unilateral exercises of state power.
[1] Possibly while singing “Duck and Cover,” which is actually decent advice for a nuclear war. Think of a nuclear explosion as a cross between a tornado and a really bright light that kills you. Getting away from a window and under a desk is not the worst heuristic, and even simple walls provide some shielding against gamma radiation. Sadly, this probably doesn’t work against SkyNet, no matter what the meme suggests. But an entire generation of children saw these videos and imagined their deaths. And some of those people still hold power. When the last of them retire, nuclear deterrence will likely weaken.
[2] “The whole point of a Doomsday machine is lost if you keep it a secret!” (Dr. Strangelove)
[3] This is paraphrased, but it’s from a real example.