25 September 2017

My time with Stanislav Petrov: No cog in the machine

Bruce G. Blair

Bruce G. Blair is a research scholar at Princeton University’s Program on Science and Global Security. His principal research interests involve steps toward the verifiable elimination of nuclear...


A week ago, news emerged that Col. Stanislav Petrov had died in May. Petrov was sometimes called “the man who saved the world” because, in 1983, as the officer in charge of a Soviet early warning command center, he told his superiors that indications of an apparent US nuclear attack were false. If Petrov had made a different decision, the result might have been all-out nuclear war.

In later years I had a chance to engage in extensive conversations with Petrov—at the United Nations, where he received a World Citizen Award, and in my home, where we spent a day in front of a film crew. Petrov and I crossed the cultural divide and quickly bonded, and we trawled the 1983 incident for insights and lessons.

We agreed that keeping nuclear missiles on hair-trigger alert and preparing to fire them quickly amid an enemy attack posed a real risk of a launch on false warning. This is pretty much conventional wisdom today, but Petrov's experience added texture to an otherwise abstract concern. We discussed the nuclear postures of both the Soviet Union and United States in 1983, and how their vigilance and their belief in the need for rapid reaction to signs of enemy attack made people into automatons. Under extremely tight timelines, there is little or no room for human reasoning and logic. The responses of operators at command centers had become highly conditioned, and almost devoid of discretionary judgment. At the Soviet command center in 1983, when the alarms and sirens announced an imminent US nuclear missile strike, adrenaline kicked in. Operators—including Petrov—became susceptible to panic and paralysis. Hewing closely to checklists helps repress anxiety and prevent total panic. But checklists also militate against creative human problem-solving, and somehow Petrov managed to rise above them.

Petrov, when faced with sudden indications that Minuteman missiles fired from silos in the US heartland were heading toward Soviet targets, responded in an exceptional way—and did so partly because his background was atypical. Unlike other operators in charge of early warning command centers, he had not been immersed in combat culture. He had not been trained and conditioned to respond to warnings by checking boxes and accepting computers’ assessments as final. He was an engineer whose day job was to troubleshoot the main computer that the center used to process data from new and recently deployed satellites (which were designed to detect the fiery plume given off by missiles during the boost phase of their launch). In fact, Petrov had been debugging the main computer for several weeks. When he was drafted that night to replace a senior combat officer who could not assume duty, he appreciated the computer’s fallibility.

The United States was fortunate that the individual sitting in the hot seat that night was Petrov—a thinking, skeptical scientist who had knowledge of and experience with the new satellite network and its computer processing equipment. This was rare good luck, but it points to the need to keep thoughtful people in the loop—while recognizing, however, that even intelligent and rational people may panic, or blindly follow checklists, when confronted with indications of a nuclear missile attack. I told Petrov about two major false alarms in the United States in 1979 and 1980. In both cases, large-scale Soviet missile strikes were reported to be under way—and in both cases, the combat crews on duty at the early warning hub inside Colorado’s Cheyenne Mountain became flummoxed. In 1979, it took the crew on duty eight minutes to resolve the situation and declare a false alarm. It was supposed to take only three minutes. The crew members were fired and retrained, and were back on duty when the second incident occurred. They blew it again, taking eight minutes instead of three, and they were fired a second time. Petrov was not surprised by this history.

We also discussed how the state of relations between the United States and Soviet Union at any given moment was a critical factor in assessing the validity of nuclear attack indications. In 1983, US-Soviet relations were at a very low ebb—perhaps the worst they had been since the Cuban Missile Crisis. President Reagan viewed the Soviet Union as an evil empire; promulgated a policy of fighting and winning a protracted nuclear war (before his eventual conversion to the idea that “a nuclear war cannot be won and must never be fought”); announced the Strategic Defense Initiative (which came to be known as “Star Wars”) to defang the Soviet offensive missile threat; and deployed new missiles in Europe that threatened a sudden decapitation strike against the Kremlin. It was a time known as “the war scare.” The Soviets feared a surprise nuclear strike and maintained a hyper-sensitive intelligence and surveillance network to detect early signs of Western aggression, particularly nuclear attack. On hair-trigger alert across the spectrum of military operations, the Soviets had—not long before the alarm sounded in Petrov's ears—mistakenly shot down a Korean civilian airliner that strayed over the far eastern Soviet Union.

This level of tension in the relationship lent credence to the (faulty) assessment by Petrov's computer that a US Minuteman strike was really happening. (The computer’s algorithm was too sensitive to the sun’s reflection off clouds, and was later reset with a higher detection threshold.) Widespread fear bordering on paranoia made operators from the bottom to the top of the chain of command—including Yuri Andropov, the top Soviet leader and a former KGB head, as well as the General Staff—inclined (or predisposed) to believe the worst. After all, they expected the worst. Petrov was not immune to this spreading fear and extreme vigilance, and yet he remained skeptical.

Frankly, some aspects of his idiosyncratic judgment that night rightfully corresponded to a much higher pay grade. Petrov strongly doubted that a small-scale US Minuteman launch could be the leading edge of a full assault to come—but it was the General Staff's responsibility, not Petrov's, to judge whether a US strike could plausibly begin this way. I mentioned to him that some US assessments of Soviet attack profiles assumed that the Kremlin would launch a small-scale opening salvo of missiles that, to prevent US retaliation, would explode in the upper atmosphere and create an electromagnetic pulse that would damage US communications. He seemed not to have considered such possibilities.

Both of us expressed our conviction that only the complete elimination of nuclear weapons by all nations would spare the world their eventual use. Like so many former nuclear commanders who have witnessed first-hand the follies and foibles of the organizations, people, and technology in command of these fearsome weapons, Petrov warned that disaster was inevitable if nuclear weapons existed indefinitely. Who knows how Andropov, Defense Minister Dmitri Ustinov, and the Soviet General Staff would have behaved if Petrov had reported that a nuclear missile strike was definitely under way. They might have waited for further evidence—or they might have cracked open the nuclear "football" and hastily ordered immediate "retaliation" before (nonexistent) incoming US warheads could have destroyed the Kremlin. I thanked Petrov for getting it right—for not being a dumb cog in the doomsday machine.