Doomsday Clock Overview
The Doomsday Clock conveys how close humanity is to catastrophic destruction--the figurative midnight--and monitors the means humankind could use to obliterate itself. First and foremost, these include nuclear weapons, but they also encompass climate-changing technologies and new developments in the life sciences that could inflict irrevocable harm.
The nuclear age dawned in the 1940s when scientists learned how to release the energy stored within the atom. Immediately, they thought of two potential uses--an unparalleled weapon and a new energy source. The United States built the first atomic bombs during World War II, which it used on Hiroshima and Nagasaki in August 1945. Within two decades, Britain, the Soviet Union, China, and France had also established nuclear weapon programs. Since then, Israel, India, Pakistan, and North Korea have built nuclear weapons as well.
For most of the Cold War, overt hostility between the United States and the Soviet Union, coupled with their enormous nuclear arsenals, defined the nuclear threat. The U.S. arsenal peaked at about 30,000 warheads in the mid-1960s and the Soviet arsenal at 40,000 warheads in the 1980s, dwarfing all other nuclear weapon states. The scenario for nuclear holocaust was simple: Heightened tensions between the two jittery superpowers would lead to an all-out nuclear exchange. Today, the potential for an accidental or inadvertent nuclear exchange between the United States and Russia remains. Both countries anachronistically maintain more than 1,000 warheads on high alert, ready to launch within tens of minutes, even though a deliberate attack by either country on the other seems improbable.
Unfortunately, in a globalized world with porous national borders, rapid communications, and expanded commerce in dual-use technologies, nuclear know-how and materials travel more widely and easily than before--raising the possibility that terrorists could obtain such materials and crudely construct a nuclear device of their own. The materials necessary to construct a bomb pervade the world--in part due to programs initiated by the United States and the Soviet Union to spread civilian nuclear power technology and research reactors during the Cold War.
As a result, according to the International Panel on Fissile Materials, substantial quantities of highly enriched uranium, one of the materials necessary for a bomb, remain in more than 40 non-weapon states. Save for Antarctica, every continent contains at least one country with civilian highly enriched uranium. Even with the improvement of nuclear reactor design and international controls provided by the International Atomic Energy Agency (IAEA), proliferation concerns persist, as the components and infrastructure for a civilian nuclear power program can also be used to construct nuclear weapons.
Much of the recent discussion focuses on Iran and its pursuit of a civilian nuclear power capability, but Mohamed ElBaradei, the IAEA director general, estimates that another 20 to 30 countries possess the capabilities, if not the intent, to pursue the bomb. Meanwhile, the original nuclear weapon states (in particular, Britain, France, Russia, and the United States) continue to modernize their nuclear arsenals, with little effort to relinquish these weapons. All of this leads many to believe that the world is embarking on a second nuclear age.
Fossil-fuel technologies such as coal-burning plants powered the industrial revolution, bringing unparalleled economic prosperity to many parts of the world. But in the 1950s, scientists began measuring year-to-year changes in the carbon-dioxide concentration in the atmosphere that they could relate to fossil-fuel combustion, and they began to see the implications for Earth's temperature and for climate change.
Today, the concentration of carbon dioxide is higher than at any time during the last 650,000 years. Carbon dioxide and other greenhouse gases warm Earth's continents and oceans by acting like a giant blanket that keeps the sun's heat from leaving the atmosphere, raising global temperature, melting ice, and triggering ecological changes that amplify the warming further. Even if carbon-dioxide emissions were to cease immediately, the extra gases already added to the atmosphere, which linger for centuries, would continue to raise sea level and change other characteristics of the Earth for hundreds of years.
The most authoritative scientific group on the issue, the Intergovernmental Panel on Climate Change (IPCC), suggests that warming on the order of 2-10 degrees Fahrenheit over the next 100 years is a distinct possibility if the industrialized world doesn't curb its carbon-dioxide emissions habit. Effects could include wide-ranging, dramatic changes. One drastic result: a 3- to 34-inch rise in sea level, leading to more coastal erosion, increased flooding during storms, and, in some regions such as the Ganges River Delta in Bangladesh and the Mississippi River Delta in the United States, permanent inundation. This sea-level rise would affect coastal cities (New York, Miami, Shanghai, London) the most, compelling major shifts in human settlement patterns.
Inland, the IPCC predicts that another century of temperature increases could place severe stress on forests, alpine regions, and other ecosystems, threaten human health as mosquitoes and other disease-carrying insects and rodents spread lethal viruses and bacteria over larger geographical regions, and harm agricultural efforts by reducing rainfall in many food-producing areas while at the same time increasing flooding in others--any of which could contribute to mass migrations and wars over arable land, water, and other natural resources.
Advances in genetics and biology over the last five decades have inspired a host of new possibilities--both positive and troubling.
With greater understanding of genetic material and of how physiological systems interact, biologists can fight disease better and improve overall human health. Scientists already have begun to develop bioengineered vaccines for common diseases such as dengue fever and certain forms of hepatitis. They are using these tools to develop other innovative medical solutions, including cells that have been bioengineered to serve as physiological "pacemakers." The publication of a draft map of the human genome in 2001 allows for even greater understanding of human functioning. As a consequence of the Human Genome Project, scientists have already identified more than 1,800 genes associated with particular diseases.
But along with their potential benefits, these technological advances raise the possibility that individuals or non-state actors could create dangerous known or novel pathogens. Additionally, researchers with the best intentions could inadvertently create novel pathogens that could harm humans or other species. For example, in 2001, researchers in Australia reported that they had accidentally created a new, virulent strain of the mousepox virus while attempting to genetically engineer a more effective rodent control method.
Unlike the biological weapons of the last century, these new tools could create a limitless variety of threats, from new types of "nonlethal" agents, to viruses that sterilize their hosts, to others that incapacitate whole systems within an organism. The wide availability of bioengineering knowledge and tools, along with the ease with which individuals can obtain specific fragments of genetic material (some can be ordered through the mail or over the internet), could allow these capabilities to fall into the wrong hands, or even those of backyard hobbyists. Such potential dangers are forcing scientists, institutions, and industry to develop self-governing mechanisms to prevent misuse. But developing a system to ensure the safe use of bioengineering, without impeding beneficial research and development, could pose the greatest international science and security challenge of the next 50 years.