The nuclear age dawned with the first atomic bombs, dropped by the United States on Hiroshima and Nagasaki in August 1945 at the end of World War II. During the Cold War years of 1949 to 1990, hostility between the United States and the Soviet Union defined the nuclear threat. Each superpower stood poised to destroy the other with arsenals that, at their combined peak, exceeded 70,000 warheads. The possibility of all-out nuclear war, a war that no one could win and that could end modern civilization, was ever present.
The US and Soviet nuclear arsenals were by far the largest, but Britain, China, and France also established nuclear weapons programs during the 1950s. Later came Israel, India, Pakistan, and North Korea. Other countries, including Brazil, Argentina, South Africa, and Sweden, initiated nuclear weapons programs but later decided to shut them down.
Today, the mind-numbing possibility of nuclear annihilation through a deliberate attack by the United States or Russia on the other seems a thing of the past. Yet the potential for an accidental, unauthorized, or inadvertent nuclear exchange between the two countries remains: both anachronistically keep more than 800 warheads on high alert, ready to launch within tens of minutes.
In addition to building their enormous nuclear arsenals, the United States and the Soviet Union spread the peaceful uses of nuclear energy, civilian power technology and research reactors, to more than 40 countries during the decades of superpower competition. As a result, according to the International Panel on Fissile Materials, the materials used to construct nuclear bombs can be found at some 144 sites around the world. Save for Antarctica, every continent contains at least one country holding civilian highly enriched uranium. Even with improved reactor designs and the international controls of the International Atomic Energy Agency (IAEA), the presence of bomb-making materials in so many places increases the chances that a terrorist group could acquire enough highly enriched uranium or plutonium to use in a bomb.
While international attention focuses today on North Korea’s small number of nuclear weapons and on Iran’s pursuit of a civilian nuclear power capability, with the possibility that it could create nuclear weapons as well, the IAEA estimates that another 20 to 30 countries possess the capabilities, if not the intent, to pursue the bomb.
Meanwhile, the original nuclear weapon states (in particular, Britain, France, Russia, and the United States) continue to modernize their nuclear arsenals, with seemingly little effort to relinquish these weapons. These trends lead many to believe that key governments have no serious intention of pursuing a world free of nuclear weapons. As long as nuclear weapons are considered a legitimate way to provide for national security, all of humanity remains at risk from the most dangerous technology on Earth.
Fossil-fuel technologies such as coal-burning plants powered the industrial revolution, bringing unparalleled economic prosperity to many parts of the world. But in the 1950s, scientists began measuring year-to-year changes in the carbon-dioxide concentration in the atmosphere that they could relate to fossil-fuel combustion, and they began to see the implications for Earth’s temperature and for climate change.
Today, the concentration of carbon dioxide in the atmosphere is higher than at any time during the last 650,000 years. Carbon dioxide and other greenhouse gases act like a giant blanket, trapping the sun's heat in the atmosphere; the resulting rise in global temperature warms Earth's continents and oceans, melts ice, and triggers a cascade of ecological changes. Even if carbon-dioxide emissions were to cease immediately, the extra gases already added to the atmosphere, which linger for centuries, would continue to raise sea level and change other characteristics of the Earth for hundreds of years.
The most authoritative scientific body on the issue, the Intergovernmental Panel on Climate Change (IPCC), suggests that warming on the order of 2 to 10 degrees Fahrenheit over the next 100 years is a distinct possibility if the industrialized world does not curb its carbon-dioxide emissions. The effects could be wide-ranging and drastic. One such result: a 3- to 34-inch rise in sea level, leading to more coastal erosion, increased flooding during storms, and, in some regions such as the Ganges Delta in Bangladesh and the Mississippi River Delta in the United States, permanent inundation. Sea-level rise will affect coastal cities (New York, Miami, Shanghai, London) most severely, compelling major shifts in human settlement patterns.
Inland, the IPCC predicts that another century of temperature increases could place severe stress on forests, alpine regions, and other ecosystems; threaten human health as disease-carrying insects and rodents spread lethal viruses and bacteria over larger geographic areas; and harm agriculture by reducing rainfall in many food-producing regions while increasing flooding in others. Any of these effects could contribute to mass migrations and wars over arable land, water, and other natural resources.
Extreme weather, including long-lasting droughts, outsized storm systems, and increasingly erratic monsoon seasons, is already reducing agricultural yields, drying up freshwater sources, and increasing the flooding of coastal cities around the world. These are precisely the effects of global warming that environmental scientists have long predicted, yet government policies have yet to encourage the changes in energy use and human settlement that could stave off the worst results and mitigate the suffering that is now bound to occur.
Advances in genetics and biology over the last five decades have inspired a host of new possibilities, both positive and troubling. Over the past 10 years in particular, biology-based technologies have grown exponentially, aided by the integration of information technology and computerized tools that allow rapid replication.
With greater understanding of genetic material and of how biological systems interact, biologists can fight disease better and improve overall human health. Scientists already have begun to develop bioengineered vaccines for common diseases such as dengue fever and certain forms of hepatitis. They are using these tools to develop other innovative medical solutions, including cells that have been bioengineered to serve as physiological "pacemakers." The publication of a draft sequence of the complete human genome in 2001 allows for even greater understanding of human functioning.
But along with their potential benefits, these technological advances raise the possibility that individuals or organizations could create new pathogens. Researchers with the best intentions could inadvertently create novel pathogens that harm humans or other species. For example, in 2001, researchers in Australia reported that they had accidentally created a new, virulent strain of the mousepox virus while attempting to genetically engineer a more effective rodent control method. In 2011, scientists engineered a highly transmissible strain of the avian influenza virus H5N1 as part of an effort to understand the virus and develop a vaccine against it.
Unlike the biological weapons of the last century, these new tools could create a limitless variety of threats, from new types of “nonlethal” agents, to viruses that sterilize their hosts, to others that incapacitate whole systems within an organism. The wide availability of bioengineering knowledge and tools, along with the ease with which individuals can obtain specific fragments of genetic material (some can be ordered through the mail or over the internet), could allow these capabilities to find their way into the hands of groups bent on violent disruption. Such potential dangers are causing scientists, research institutions, and industry to put in place self-governing mechanisms to prevent misuse. But developing a robust universal system to ensure the safe use of bioengineering, without impeding beneficial research and development, could pose the greatest international science and security challenge in the early 21st century.
In addition to rapid developments in the biological sciences, the application of cyber technology to industrial operations, advanced manufacturing, and the miniaturization and replication of systems at the atomic and molecular scale presents extraordinary opportunities for curing disease and engineering new products to enhance human welfare. Greater understanding of human neurosystems also holds the promise of increasing cognitive capacity and resisting the ravages of genetic and dementia-related mental illnesses.
Yet there are few governing systems in place to control the uses of these new inventions. These technologies have benign uses, but they can also be dangerous when put to malicious purposes. Whether wielded by governments or non-state actors, they can be unleashed on societies, causing grave and irreversible harm. And even with the best intentions, deploying technological solutions, say, geoengineering to combat climate change, may lead to unintended consequences with devastating effects.
Furthermore, some of these emerging technologies are being developed for military applications that may increase the effectiveness of military operations, the accuracy of weapons in combat, and the control of weapons systems. But such knowledge cannot be kept secret. By utilizing powerful new technologies, militaries may create new methods of killing and subduing populations that could come back to haunt us.
As President John F. Kennedy said about the discovery of nuclear fission, “our progress in the use of science is great, but our progress in ordering our relations small.” The challenge remains whether societies can develop and apply powerful technologies for our welfare without also bringing about our own destruction through misapplication, madness, or accident.