The test ban treaty at 60: How citizen action made the world safer

By Robert Alvarez, Joseph Mangano | August 4, 2023

Crater left by the 1954 Castle Bravo nuclear weapons test on Bikini Atoll seen from space (Google, Maxar Technologies, Image Landsat / Copernicus, Data SIO, NOAA, US Navy, NGA, GEBCO).

Sixty years ago, almost to the day, in a Cold War world haunted by the specter of nuclear war, negotiators brought large-scale atmospheric nuclear weapons tests to an end. The United States, Soviet Union, and United Kingdom—which had conducted over 500 above-ground tests, with the combined power of 30,000 Hiroshima bombs—agreed to end testing in the atmosphere, under water, and in outer space. France and China, which conducted far fewer tests, did not sign but had ended atmospheric testing by 1980. The Limited Test Ban Treaty became the first international environmental treaty curtailing the poisoning of Earth.

In 1945, at the dawn of a new era, just after the first nuclear test explosion in Alamogordo, New Mexico, researchers at Los Alamos National Laboratory reported that “the most worldwide destruction could come from radioactive poisons.” In 1951, the US Atomic Energy Commission (AEC) authorized a study of strontium 90, one of the radioisotopes in nuclear fallout, in bones of deceased humans throughout the world. Within a few years, the commission recognized that while humans in all parts of the world were taking up fallout, some were especially hard-hit, such as the people of the Marshall Islands; and as testing continued, concentrations in the human body increased sharply.

By the early 1950s, AEC leaders were aware of fallout hazards to humans, but chose to keep the public in the dark. For example, at a secret November 1954 meeting, following the six large hydrogen bomb tests of the “Castle” series in the Marshall Islands, John C. Bugher, the head of the AEC’s Division of Biology and Medicine, told members of the AEC General Advisory Committee that a radioactive form of iodine “can be detected in thyroids all over the U.S.” Bugher “cautioned against the use of milk from heavily contaminated areas.” Hot spots 5,000 miles away in the continental United States showed radiation levels more than 2,000 times greater than normal background. One test alone, known as Bravo, had an estimated yield of 15 megatons—the power of 1,000 Hiroshima bombs—and its environmental releases dwarfed those from the subsequent Chernobyl and Fukushima meltdowns.

While information regarding the domestic impacts of fallout from H-bomb tests was classified for decades, it was impossible to hide the fact that fallout from Bravo, exploded on March 1, 1954, caused immediate and severe harm to Japanese fishermen and to the people of the Rongelap and Utirik Atolls in the Marshall Islands, 200 miles away. In the aftermath of what the US Radiochemical Society describes as “the worst radiological disaster in US history,” Bravo galvanized public opposition to atmospheric testing—helping pave the way for the Partial Test Ban Treaty.


By 1963, nearly two decades of bomb testing had poisoned the air, land, and water with hundreds of radioisotopes, many of which can still be found today, even in the most remote places on Earth. Since the 1950s, the word “fallout” has been part of the lexicon throughout the world. Notably, plutonium (with its 24,000-year half-life) is so ubiquitous that it is considered a key marker, along with climate change, for a proposed new geological epoch. Known as the Anthropocene, this epoch begins in the mid-20th century and encompasses the most significant human impacts on the planet.

Over time, scientists, public officials, and citizens raised questions about fallout. How much fallout was entering the body? Was this considered a high level? Was the fallout a risk for cancer and other diseases? Were some humans at greater risk than others? Some scientists were moved to make their concerns public. While nobody knew exactly how much exposure it took to cause cancer, it was clear that fallout posed a potential danger, and that rising levels were a concern.

Calls for a ban on atomic tests, by scientists and elected officials, emerged in the 1950s. Americans and Soviets observed a moratorium on testing between fall 1958 and fall 1961, only to see testing resume amid growing Cold War tensions. Environmental levels of radioactivity reached new highs, prompting citizens—led by groups such as Women Strike for Peace and the National Committee for a Sane Nuclear Policy—to organize large rallies urging a halt to testing. Their message frequently focused on the hazards posed to infants and children.

This message was prompted in part by a landmark study of radioactive strontium 90 in baby teeth, organized by citizens and scientists in St. Louis. As protests continued, results of the study showed that strontium 90 levels had tripled in just three years in the early 1950s, and levels in the next decade were much greater (as it turned out, a 60-fold increase from the early 1950s to the mid-1960s). Results were shared with President John F. Kennedy, who cited children “with cancer in their bones, with leukemia in their blood, or with poison in their lungs” as a reason for the treaty.


The effects of the treaty were multiple. The immediate reaction to its Aug. 5, 1963 signing was a sigh of relief, just 10 months after the world had come perilously close to nuclear war during the Cuban Missile Crisis. Kennedy described the agreement as “a shaft of light cut into the darkness.” During a tour of traditionally Republican western states, he was startled to hear wild applause from crowds as he announced the treaty and the end of the testing era.

Although the test ban didn’t immediately end bomb testing or eliminate the threat of nuclear war—still very much a reality six decades later—it marked the beginning of the end of bomb testing, which has essentially ceased since the early 1990s. It improved the tone of relations between communist and non-communist leaders; there were no more crises with a serious threat of all-out nuclear war. By the mid-1970s, the two superpowers began to negotiate a series of agreements to reduce nuclear weapon arsenals, a reduction that has now reached about 80 percent from the peak.

A crucial legacy of the treaty is the reduction of fallout levels in the environment and the body. Within five years of the agreement, strontium 90 in the milk supply, baby teeth, and human bones fell by more than half, sparing future generations the consequences of early-life exposures to one of the most harmful man-made substances in history.

Finally, the events of the two decades that led to the test ban demonstrate the ability of citizen activism, combined with scientific knowledge, to achieve a change in public policy desired by a large majority of people. Although the toxic mix of radionuclides found in bomb fallout continues to be produced in and released from nuclear reactors, the treaty still stands as a landmark achievement in the continuing effort to foster peace and a healthy environment.

