Lessons from the Cold War on Information Hazards: Why Internal Communication is Critical

Due to their tremendous power, nuclear weapons were a subject of intense secrecy and taboo in the US government following WW2. After their first uses, President Truman came to consider atomic weapons a terror weapon of last resort and generally avoided discussion of their use.1 As a consequence, top-level decision makers in his administration outside the Atomic Energy Commission (AEC), himself included, remained unaware of the size of the US nuclear stockpile as it grew,2 and, given Stalin’s resistance to opening the Soviet Union to inspections, relatively few means of achieving arms control were considered at a high level.3 Frustrated by failed negotiations and communist geopolitical expansion, the US made its atomic weapons ready for use again in 1947,4 and no arms control talks beyond the propaganda level were attempted for the next ten years.5

The consequences of this lack of internal communication did not end with Truman. Somewhat insulated from Eisenhower’s stance against nuclear first use,6 Curtis LeMay at Strategic Air Command drew up plans that leaned increasingly toward it.7 Growing from a 1,000-weapon air force to an 18,000-weapon triad8 carrying potentially up to 27,000 megatons of explosive power by 1960,9 the arsenal went far beyond what Eisenhower’s massive retaliation strategy required. If the warheads were equally sized and evenly distributed, an airburst attack with that many weapons could have scorched more than half of the Soviet wilderness or shattered windows over more than 90% of the USSR, China, and North Korea.10 One might argue that such a degree of force was rational to account for weapon unreliability or to retain significant retaliatory capability after a Soviet first strike, but LeMay’s plans presumed using nearly all the weapons in a preemptive strike,11 since US bombers were vulnerable on open runways. Generally speaking, the nuclear war planners of the 1950s were uncoordinated, did not account for many of the known effects of nuclear weapons, and were given no war objectives from which limits on their plans could be derived.12 Though Eisenhower learned that SAC’s plans were excessive even for counterforce, he never commissioned a study on the lasting environmental effects of such a massive use of force.13
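For the curious, the arithmetic behind footnote 10 can be sketched in a few lines of Python. This is only a minimal back-of-envelope check: the weapon count and per-weapon yield come from the text, while the land areas and damage radii below are rough placeholder assumptions (the original estimate used Google Maps and Alex Wellerstein’s NUKEMAP), so the printed percentages depend entirely on those inputs and are illustrative rather than a reproduction of the footnote’s figures.

```python
import math

NUM_WEAPONS = 18_000   # SAC arsenal size by 1960 (footnote 8)
YIELD_MT = 1.5         # per-weapon yield assumed in footnote 10

# Approximate land areas in km^2 (rounded placeholder values; the
# original estimate used Google Maps measurements).
AREA_USSR = 22_400_000
AREA_CHINA = 9_600_000
AREA_NORTH_KOREA = 120_000

# Placeholder damage radii for a 1.5 Mt airburst, in km. These are
# assumptions for illustration only -- substitute NUKEMAP's output
# before trusting the resulting percentages.
THERMAL_RADIUS_KM = 12.0   # rough "scorch" (severe thermal) radius
GLASS_RADIUS_KM = 20.0     # rough window-shattering radius

def fraction_covered(num_weapons: int, radius_km: float, target_km2: float) -> float:
    """Fraction of the target area covered, assuming non-overlapping circles.

    As footnote 10 notes, real warheads were not uniform and damage
    circles would overlap, so actual coverage would be lower.
    """
    per_weapon_km2 = math.pi * radius_km ** 2
    return min(1.0, num_weapons * per_weapon_km2 / target_km2)

total_yield_mt = NUM_WEAPONS * YIELD_MT
print(f"Total yield: {total_yield_mt:,.0f} Mt")  # 27,000 Mt, matching the text

print(f"Scorched fraction of USSR land area: "
      f"{fraction_covered(NUM_WEAPONS, THERMAL_RADIUS_KM, AREA_USSR):.0%}")
print(f"Window-damage fraction of USSR+China+NK: "
      f"{fraction_covered(NUM_WEAPONS, GLASS_RADIUS_KM, AREA_USSR + AREA_CHINA + AREA_NORTH_KOREA):.0%}")
```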

With secrecy and taboo around nuclear weapons, each party involved in planning their use and policy ended up with a great deal of autonomy, able to make mistakes that would have been impossible in a higher-feedback environment. With little flow of ideas between different parts of government, Strategic Air Command could pursue plans and procurements that were neither efficient nor aligned with US political interests. Though analysts at the RAND Corporation had determined that counterforce strategies were problematic,14 SAC did not share its intelligence capabilities and so could feel overly confident in dismissing such analysis.15

Since the US’s open society was likely more vulnerable to Soviet spies than vice versa, it made sense for the US to be concerned about leaks that could give the USSR an advantage or create public pressure forcing the president into non-strategic actions. In general, compartmentalizing information is an effective way to reduce the risk of spies leaking a decisive amount of information, but cutting information flow at the highest levels of government directly harms the government’s ability to make decisions on the basis of the classified information it possesses. If high-level planners cannot be trusted with the information needed to make rational plans, they should not have been made planners in the first place. If too many people are required to make good plans securely, then compartmentalization should be organized around accomplishing smaller objectives and building models from which plans can be derived. Instead, compartmentalization happened by military service branch and agency, resulting in duplicated planning effort and the involvement of too many inexperienced people whose work escaped criticism.

To characterize the disunity of intelligence and communication during the Cold War, some in the defense community tell this joke:16

US Air Force: “The Russians are here!”

Defense Intelligence Agency: “The Russians are not here yet, but they are coming.”

CIA: “The Russians are trying, but they won’t make it.”

Bureau of Intelligence and Research (INR, in the Department of State): “The Russians? They aren’t even trying.”


For states and research organizations to avoid falling prey to these sorts of mistakes in situations with risks of harmful information spread (information hazards), a few principles seem derivable from this history.

1: It is extremely important for people with good ideas to press them within their secure communities in order to improve decisions. Ideas kept secret and immune from criticism are likely to be ill-conceived; if there is risk from public engagement (e.g., spreading knowledge of a bioweapon that may have become easier to manufacture), such conversations should still be had within secure communities to better flesh out the real risks and create strategies for mitigating them.

2: The bounds of secrecy should be well defined. Stifling conversation in general shuts down both useful and harmful information flow, so drawing good delineations can yield a net reduction in risk. Secrecy can also be abused: to gain information advantages over competing interest groups, to increase one’s social status, or to maintain corruption. For this reason, cultures of openness and transparency can sometimes develop a net advantage over more secretive ones, since they better align the incentives of those involved.


Footnotes:

  1. Rosenberg, D. A. (1983). The Origins of Overkill: Nuclear Weapons and American Strategy, 1945-1960. International Security, 7(4), 11.

  2. ibid., 11

  3. McGeorge Bundy, Danger and Survival (New York: Random House, 1988), 130

  4. ibid., 339

  5. ibid., 130

  6. ibid., 252

  7. ibid., 322

  8. ibid., 319

  9. ibid., 320

  10. Calculation made using land area estimations from Google Maps and Alex Wellerstein’s NUKEMAP app, assuming 1.5-megaton airburst weapons. Note that since the weapons would not all be uniform, the actual maximum area of damage would be lower.

  11. Bundy 322

  12. Morse, John H. (14 February 1957). “Letter from Captain John H. Morse, Special Assistant to the Chairman, Atomic Energy Commission, to Lewis Strauss, Chairman, Atomic Energy Commission”. In Burr, William. “It Is Certain There Will Be Many Firestorms”: New Evidence on the Origins of Overkill (PDF). Electronic Briefing Book No. 108 (Report). George Washington University National Security Archive. Dwight D. Eisenhower Library, Records of the Special Assistant for National Security Affairs, NSC Series, Briefing Notes Subseries, box 17, Target Systems (1957–1961).

  13. Bundy 324-325

  14. Andrew David May, “The RAND Corporation and the Dynamics of American Strategic Thought, 1946–1962”, PhD Dissertation, Emory University, 1998, 235, 291.

  15. Austin Long & Brendan Rittenhouse Green (2015). Stalking the Secure Second Strike: Intelligence, Counterforce, and Nuclear Strategy. Journal of Strategic Studies, 38(1-2), 44. doi:10.1080/01402390.2014.958150

  16. Johnson, L. K. (2008). Glimpses into the Gems of American Intelligence: The President’s Daily Brief and the National Intelligence Estimate. Intelligence and National Security, 23(3), 333-370. doi:10.1080/02684520802121257