The use of violence in response to violations of the NPT (the nuclear Non-Proliferation Treaty) has been fairly limited, and its legality under international law is highly questionable. Calls for such violence are, in fact, strongly frowned upon out of fear that it tends to escalate into full-scale war.
No one has ever seriously suggested violence as a response to a potential violation of the various other nuclear arms control treaties, and no one has ever seriously suggested running the risk of a nuclear exchange to prevent one. What Yudkowsky is suggesting is therefore very different from how treaty violations are usually handled.
Given Yudkowsky’s belief that the continued development of AI has an essentially 100% probability of killing all human beings, his position is internally consistent. But he is explicitly advocating violence up to and including acts of war. (His objections to individual violence appear mostly to concern its ineffectiveness.)
The more interesting question is: where else do we see something similar occurring?
For example, retirement income has historically been discussed in terms of expected value. More recently, discussions of retirement planning have begun to focus instead on the probability of running out of money. Are there other areas where people focus on expected outcomes rather than on the probability of a particular bad outcome occurring?
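To make the contrast concrete, here is a minimal Monte Carlo sketch of the retirement case. All parameters (starting balance, withdrawal, return distribution) are purely illustrative assumptions, not recommendations; the point is only that the same simulation yields both framings, and they can tell different stories.

```python
import random

def simulate(balance=1_000_000, withdrawal=50_000, years=30,
             mean_return=0.06, stdev=0.12, trials=10_000):
    """Compute expected ending balance AND probability of ruin.

    All parameters are hypothetical, chosen for illustration only.
    """
    endings, ruined = [], 0
    for _ in range(trials):
        b = balance
        for _ in range(years):
            # Withdraw at the start of the year, then apply a random return.
            b = (b - withdrawal) * (1 + random.gauss(mean_return, stdev))
            if b <= 0:
                ruined += 1
                b = 0
                break
        endings.append(b)
    expected = sum(endings) / trials
    return expected, ruined / trials

expected, p_ruin = simulate()
print(f"Expected ending balance: ${expected:,.0f}")
print(f"Probability of running out of money: {p_ruin:.1%}")
```

Under assumptions like these, the expected ending balance can look comfortably positive even while a meaningful fraction of simulated paths end in ruin, which is exactly why the choice of framing matters.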