Landmines, cluster munitions, incendiary weapons, blinding lasers, exploding bullets, and much more: The list of weapons banned or regulated by international humanitarian law has grown steadily over the past 150 years. If an international campaign of civil society organizations—supported by about two dozen countries and growing—is successful, there could soon be another to add: autonomous weapons.
Given the unprecedented risks autonomous weapons pose, and the strength of the movement against them, a new treaty regulating such weapons is both desirable and viable. Whether that treaty is effective, however, will depend primarily on whether the United States decides to engage in negotiating it and convinces other militarily important countries to do the same.
Not yet deployed. Autonomous weapons, or “killer robots,” as their opponents and the media often call them, are weapons that select and attack targets without direct human control. Think of a drone scanning the battlefield and using artificial intelligence to identify and fire upon a suspected enemy combatant, without waiting for a human operator to approve the strike.
The exact definition of a lethal autonomous weapon is hotly contested. While critics also express concern about non-lethal, anti-materiel, or semi-autonomous weapons, for now international talks have focused only on fully autonomous, lethal anti-personnel weapons. Even under this narrow definition, no military has deployed such weapons yet, but the technology to do so already exists and is developing rapidly.
To address the humanitarian risks of autonomous weapons, about 100 countries have been discussing the possibility of negotiating a new treaty within the Convention on Certain Conventional Weapons (CCW), a little-appreciated, United Nations-affiliated forum for regulating inhumane weapons. Since 2014, the slow-moving CCW has agreed to renew talks on the issue without being able to reach the consensus the convention requires to actually start negotiating a treaty.
Too soon to regulate? One of the driving forces behind these discussions is an international movement of groups and activists opposed to the unrestricted use of autonomous weapons. Chief among these are the ubiquitous International Committee of the Red Cross and the more militant Campaign to Stop Killer Robots, a coalition of nongovernmental organizations, including Human Rights Watch, that have been active in earlier campaigns to ban landmines and cluster munitions. So far, the campaign has managed to convince about two dozen countries—including Austria, Brazil, and Mexico—to support a preemptive ban on the development and deployment of lethal autonomous weapons. Several more countries, like Germany and France, support a political declaration, but not a legally binding treaty.
The Campaign to Stop Killer Robots and other critics charge that autonomous weapons are immoral and dangerous because they lack the human traits (like mercy) needed for moral decision making, as well as the ability to distinguish between civilians and combatants and to judge the proportionate use of force, two key principles of international humanitarian law. The critics argue convincingly that if the development of autonomous weapons is left unregulated it could lead to a destabilizing arms race. This threat would be made worse by the difficulty in determining who is responsible for the actions of an autonomous weapon, meaning a small incident could spark an international crisis. As with drones, autonomous weapons could make it easier for countries to start unnecessary wars by keeping soldiers off the battlefield, offering the illusion of “risk-free” military intervention but providing no protections for civilians.
The United States, Russia, Israel, and a few other countries oppose either a new treaty or a political declaration. These countries are investing heavily in robots and artificial intelligence. They argue it is too soon to know how autonomous weapons might be used in the future and therefore too soon to know how best to regulate them, if at all. The United States has stated that autonomous weapons could even improve compliance with international law by being better than humans at identifying civilians and judging how to use force proportionately.
Prospects for a standalone treaty. Unhappy with the lack of progress in the CCW, the Campaign to Stop Killer Robots is increasingly urging countries to consider bypassing the convention entirely to negotiate a separate treaty, stating, “If the CCW cannot produce a credible outcome [at its annual meeting on November 15], alternative pathways must be pursued to avoid a future of autonomous warfare and violence.” Unfortunately, such a decision, while understandable and feasible, would be unlikely to produce a truly effective treaty.
One might ask what chance nongovernmental organizations like Human Rights Watch have for achieving a standalone treaty against the opposition of some of the world’s most powerful militaries. Plenty, actually.
By the 1990s, the widespread and indiscriminate use of landmines had become a humanitarian disaster, and the members of the CCW tried to solve the crisis by strengthening an existing CCW treaty regulating this weapon. Frustrated by the perceived weakness of the CCW agreement, the International Campaign to Ban Landmines pushed for a new treaty, under the auspices of the Canadian government, that would ban all landmines without requiring the burdensome consensus decision-making of the CCW. The resulting Mine Ban Treaty mostly ended the large-scale use of landmines outside of a few conflict zones and earned the campaign a Nobel Peace Prize.
In 2008, a similar coalition of nongovernmental organizations repeated this feat, successfully pushing for a Convention on Cluster Munitions outlawing this once-ubiquitous weapon, after years of talks in the CCW had produced only modest results. Even though the United States, Russia, and other major military powers have not joined either treaty, the treaties have created a powerful stigma against landmines and cluster munitions.
Given this history of success, it is tempting to conclude that a strong, standalone treaty is the best way to deal with the threat posed by autonomous weapons, despite the fact that countries like the United States and Russia would almost certainly refuse to join. Autonomous weapons, however, are not landmines or cluster munitions. Landmines and cluster munitions were used around the world for decades in conflicts large and small, in many cases causing great civilian harm. Treaties banning these weapons have value even when the United States, Russia, China, and other major military powers do not participate. In contrast, autonomous weapons are a developing technology likely to be used by only the most advanced militaries for some time. A treaty that excludes almost all the countries with the interest and ability to deploy autonomous weapons would have comparatively little value either as arms control or as a humanitarian norm builder.
At a time when even the taboos against chemical and nuclear weapons appear to be waning, it is hard to imagine that Russia, for example, would consider its autonomous weapons program constrained by the perceived stigma created by a treaty it had no hand in making. A more modest treaty, negotiated in the CCW with the agreement of the world’s major military powers, offers the best chance of providing meaningful restrictions on autonomous weapons in the foreseeable future.
A US policy solution. What could the United States do to achieve such a treaty? The CCW treaty on blinding laser weapons may offer a guide. While blinding lasers and autonomous weapons differ in terms of their military utility and humanitarian threat, both weapons became the subject of campaigns to ban them before they were ever deployed. Opponents of autonomous weapons point to this analogy as proof that a weapon can be banned preemptively, but it also shows how the United States can use a national policy to help reach a difficult international compromise. The United States had long resisted any attempts to regulate the use of lasers to cause blindness, worried that any such regulation could interfere with unrelated military uses of lasers. Then in 1995, as CCW negotiations were underway, the Defense Department adopted a limited national ban on blinding laser weapons. By using this new policy as a basis for negotiations, the United States was able to broker an agreement in the CCW that satisfied countries that wanted a broader ban, countries that opposed any ban, and the requirements of the US military. In doing so, the United States was able to make sure the treaty did not restrict other, less controversial uses of lasers—a concern that is highly relevant to autonomous weapons as well.
In fact, the United States already has a national policy that could serve as the basis for a new CCW treaty. In 2012, the Department of Defense issued a directive requiring “appropriate levels of human judgment over the use of force,” thereby becoming the first country to publicly adopt a national policy on autonomous weapons. The Pentagon even tasked a committee of ethicists, scientists, and other experts with creating an ethical framework for artificial intelligence—their just-released report endorses strong principles of responsibility, traceability, and more.
Clearly, the US government shares some of the activists’ concerns over the ethics of autonomous weapons and is comfortable with some limitations on their use. If the United States can strengthen its existing national restriction on autonomous weapons, it would be well placed to negotiate a new treaty in the CCW. While there is no guarantee that Russia and other countries would agree to start negotiations, US support would increase the pressure on them considerably.
“Killer robots” will soon no longer be confined to the realm of science fiction. To address the new risks autonomous weapons will bring, the world needs a new and effective treaty regulating them. The best chance to achieve such a treaty is for the United States to drop its opposition and take an active role in negotiating a new agreement in the existing forum for regulating inhumane weapons.