A reminder from Hawaii

By Lauren J. Borja, M.V. Ramana | January 17, 2018

On January 13, the residents of Hawaii received a startling phone alert: “BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.” Within seconds, fear spread across the island state as people sought shelter.

At the Hawaii Emergency Management Agency (HI-EMA), a different type of panic was sweeping through the building. Employees soon realized that procedures to officially cancel a ballistic missile alert had not yet been created.

Twelve minutes after the alert was sent out, statements identifying it as a false alarm were released on Twitter and through news apps. Many Hawaiians either did not receive these messages or did not believe them. HI-EMA took 38 minutes to officially retract the alarm.

The false alarm in Hawaii is one of those rare instances when the public gets a peek into the secretive world of nuclear weapons. Within this world, errors and malfunctions of the kind witnessed in Hawaii occur in various other systems associated with the nuclear arsenal. How often they occur remains a tightly guarded secret, but we know that they do happen.

One part of the arsenal where errors are of particular concern is the early warning system. This system consists of components such as satellites and radars meant to detect signs of missiles heading toward the United States and to warn those managing the nuclear arsenal of an impending attack. If such a warning is deemed credible, the current nuclear posture of the United States calls for its own missiles to be launched at very short notice. (Russia has its own early warning system and its own official policy regarding its response.)

The early warning system is highly complex and involves thousands of components; a vast number of people are involved in its operation and maintenance. Components can fail or malfunction. Almost by definition, all of these people are capable of human error. The very limited amount of publicly available information shows that the system does experience errors and false alarms.

Information obtained through a Freedom of Information Act request reveals that from 1977 through 1984, the early warning systems generated an average of 2,598 warnings of potential incoming missile attacks each year, with about five percent of these requiring further evaluation, according to Bruce G. Blair’s book, The Logic of Accidental Nuclear War. None of those warnings was caused by an actual incoming missile. All of them were, by that definition, false.

Some of these antiquated systems have likely been upgraded to eliminate such a high rate of false warnings. But not necessarily: as tensions between a nuclear-armed North Korea and the United States continue to rise, old warning systems have been brought back online. The big question is how these systems, upgraded or not, will operate in today’s connected world, exposed as they are to modern information infrastructure. Many think tanks and government entities have raised concerns about cyberattacks on early warning systems. A cyberattack could have been behind the fraudulent mobile evacuation alert issued in September 2017 to US military personnel and their families stationed in South Korea.

False alarms also raise the question of how the target audience reacts. In Hawaii, the message was explicit: “SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.” Many people took that seriously, as well they should have, and scrambled to get themselves and their loved ones to places that they thought, or were told, would be safe. Confronted with conflicting information on less official channels, such as Facebook or Twitter, many innocently stayed where they were until receiving the official cancellation message.

Instead of dismissing this as naiveté, compare it to the response of National Security Advisor Zbigniew Brzezinski in November 1979. A phone call woke Brzezinski at 3 a.m. one morning; on the other end of the line was his military assistant and crisis officer, William Odom, who informed him that Soviet missiles had been fired at the United States. Brzezinski told Odom to call back when the information had been confirmed, but he also told him to “Make sure the Strategic Air Command proceeds to take off.” In other words, nuclear weapons were to be readied for launch. One minute before Brzezinski intended to inform President Jimmy Carter, Odom called back to say that the warning was erroneous: a training tape that simulated an attack had been mistakenly played on the computer. Brzezinski’s last words to Odom that night: “Make sure the Strategic Air Command is called back.” One shudders to think of what might have happened if Odom had forgotten to do that, or if there had been a problem with communications at either end.

This example illustrates how the US command and control system operates: it is postured for immediate retaliation. That posture is based on receipt of a warning, not confirmation of an actual nuclear explosion, and has been called launch on warning (LOW) or launch under attack (LUA). Countries with these policies in place, such as the United States and Russia, quickly move forward with plans to launch missiles of their own in response, unless and until the warning is dismissed as false. Because these policies are designed to minimize decision time, so that nuclear weapons can be launched before the suspected attack destroys them, they leave little time for measured judgment, or even for thorough verification of warning messages.

The brevity of the decision time is also due to how quickly missiles travel. Even if a missile were coming from halfway across the world, its total flight time would be less than the 38 minutes during which people in Hawaii mistakenly believed a nuclear attack was imminent. A missile launched from a shorter distance, say from a submarine in the Atlantic Ocean, would arrive even sooner. The actual time that the present-day equivalents of Brzezinski and Carter would have to make the critical decision, whether or not to order the launch of US nuclear-armed ballistic missiles, would be no more than a few minutes.

Should anyone be allowed—or compelled—to make such momentous and perilous decisions? Should missiles be kept in a state where they could be fired off at such short notice? Should we risk nuclear war, especially one that no one intends to start? The answer, as Nobel Prize-winning writer Bob Dylan’s song goes, is blowin’ in the wind. What happened in Hawaii reminds us to pay attention.

