Death of efforts to regulate autonomous weapons has been greatly exaggerated

By Neil C. Renic | December 18, 2019

Image: a robot made of weapons dissolves into doves. Courtesy of Shutterstock.

Arms control has seen better days. In August, the United States formally withdrew from the Intermediate-Range Nuclear Forces Treaty. The Open Skies Treaty will likely soon follow suit. There are doubts as to whether the New Strategic Arms Reduction Treaty will be renewed before it expires in February 2021, as well as concerns over the future of the Nuclear Non-Proliferation Treaty. “The painstakingly constructed arms control regime is fraying,” UN Secretary-General António Guterres argued in a recent statement.

The forecast looks equally gloomy for efforts to regulate emerging military technologies, such as lethal autonomous weapons systems, or LAWS. For example, the Campaign to Stop Killer Robots has been striving since 2013 to secure a pre-emptive ban on this weaponry, but no prohibition has materialized. Instead, some states have been intensifying their investment in autonomous weapons, in cooperation with the private sector. We seem to be moving ever closer to the use and normalization of this technology in war.

This raises two closely related questions: Has the regulation of LAWS failed? And if so, where exactly should we assign blame?

Those who say it has failed lay some measure of fault on the UN Convention on Certain Conventional Weapons (CCW).

Autonomous weapons and the CCW. The CCW entered into force in 1983 and was established to restrict or prohibit particular weapons deemed indiscriminate in their effects or “excessively injurious,” as the language of the treaty puts it. The Convention covers a wide range of military technologies, including landmines, incendiary weapons, and anti-personnel blinding lasers. The problem of lethal autonomous weapons systems was taken up in 2014, and in 2016 the CCW established a special working group, known as the Group of Governmental Experts, to debate the matter.

The CCW operates on a consensus basis, meaning that all decisions must have universal support from member states. In the case of LAWS, this format acts as an impediment to reform, granting states such as the United Kingdom, the United States, and Russia effective veto power over the formal negotiation of a ban treaty.

Last month, a meeting of the states that are party to the CCW took place in Geneva, Switzerland. On the topic of LAWS, participants finally confirmed the duration of future meetings (10 days split over two meetings in 2020, and between 10 and 20 days in 2021), and recommitted themselves to the goal of developing “aspects of [a] normative and operational framework [for lethal autonomous weapons systems].” Hardly ground-breaking.

These relatively marginal gains have reinforced a growing sense among campaigners that the CCW process has hit a wall. As one DW news service headline put it: “UN impasse could mean killer robots escape regulation.” The head of the Campaign to Stop Killer Robots, Mary Wareham, has gone so far as to ask whether the purpose of the CCW talks is, in fact, “to legitimize the development, production, and use of LAWS.”

With these criticisms in mind, it would seem reasonable to conclude that the CCW process has failed and should be discarded in favor of other avenues for regulation.

At best, this view is only half-correct. While it may indeed be time to shift the debate over autonomous weapons to a new forum, this does not invalidate what has already been accomplished. The CCW process has been of real value, clarifying the nature and significance of the challenge of autonomous weapons systems, and catalyzing a new and more productive stage of regulation.

The clarifying power of the CCW. In a statement released last month, the nongovernmental organization Mines Action Canada criticized the lack of ambition on display within the UN CCW. It accused the body of “wandering aimlessly through the diplomatic woods” regarding LAWS, and stressed that the time has come to “end discussion and take action.”

Concrete action on LAWS, either in the form of a legal framework for appropriate conduct or an outright ban, is clearly overdue. If it is achieved at all, however, it will be at least in part thanks to the debate within the CCW that has preceded it.

The need for military regulation is rarely self-evident. In the absence of an organized campaign to highlight the moral and legal controversy of a new technology, innovation often proceeds with little to no guidance beyond strategic calculation. To avoid this problem, would-be regulators must clarify the nature, scale, and urgency of the challenge at hand, as well as the humanitarian consequences of inaction.

This is always a challenge, but especially so in the case of LAWS, where regulation is sought pre-emptively: The autonomous weapon systems that most trouble humanitarians have yet to emerge. There is no egregious incident to cite; no cautionary tale to draw upon to make the case for reform. In such circumstances, campaigners must instead rely upon informed prediction to clarify the significance and stakes of the issue.

Has the CCW generated such clarity? Some would argue that it has done the very opposite; that a persistent ambiguity over the technical aspects of LAWS has allowed certain states to evade their responsibilities over their regulation.

There is some truth to this charge. Arguments over the precise meaning of terms such as “full autonomy” and “human control” have undoubtedly complicated the debate on LAWS. But drawing out such complexities has been essential to move this debate forward. We now have a clearer comprehension of the importance of reaching a common understanding over key terminology—and a growing number of projects working to provide it.

We also should not forget that to successfully regulate the battlefield, campaigners must generate not only light but heat. Beyond its contribution to the technical dimensions of this debate, the CCW process has brought a clarity of purpose to the regulation of LAWS.

The large and growing campaign to pre-emptively ban the use of autonomous weapons is today unified around a core message: that this technology poses a serious—likely inherent—challenge to the moral and legal standards of war.

And progress is being made. In October, Namibia became the 30th state to support an international treaty to prohibit lethal autonomous weapons systems. There has also been a growing number of calls, in numerous countries, reaffirming the need to maintain meaningful human control over the weapons of war.

The CCW has also been an important venue through which to publicize this issue to the general public. The benefits of such outreach have been significant: a recent YouGov poll of 10 European countries found that 73 percent of respondents favored an international ban on LAWS.

None of this was inevitable. It isn’t difficult to imagine alternative circumstances where autonomous weapons develop against a backdrop of widespread ambivalence, or even approval. That the spotlight has been so intensely focused on the moral and legal challenges posed by this technology is testament to the efforts of campaigners, and their use of forums such as the CCW to bring much-needed clarity to this issue.

The CCW as catalyst. For those who view autonomous weapons as a threat to our very humanity that requires immediate and categorical prohibition, the CCW process is disappointing. The body operates on a consensus basis, giving resistant states the ability to unilaterally derail suggested reform, a power they have consistently wielded.

But this does not invalidate the CCW. The structural deficiency of this process, and the institutional sclerosis it has generated, has galvanized, rather than disheartened, campaigners.

For proof of this potential, we need look no further than the anti-personnel landmine ban of 1997. It, too, had a troubled history: Despite an effective strategy to draw widespread attention to the civilian cost of this technology, campaigners failed to secure a ban at the 1995 Review Conference of the CCW.

This setback did not end the campaign to ban landmines, but rather energized it. By 1996, the International Campaign to Ban Landmines had attracted the support of 600 nongovernmental organizations, as well as a growing number of states. One year later, the Ottawa Treaty, prohibiting the production, development, and use of anti-personnel landmines, was signed by 122 states.

The regulation of autonomous weapons has the potential to follow the same route. Earlier this year, Mary Wareham acknowledged that the recalcitrance of states such as Russia and the US “is forcing us and I think others, to explore any other avenue.” What form this alternative avenue will take is not yet clear, but there are a few possibilities.

One option is multilateral action within the United Nations General Assembly. A French/German declaration earlier this month to develop “a normative framework” on autonomous weapons received the support of dozens of foreign ministers. Brazil has also offered to host a symposium on LAWS in February 2020. In either case, the CCW will have served as an incubator for reform.

This is a lesson that is worth taking in, not just for LAWS, but for arms control more broadly. Not all failures of regulation are alike. Some are truly as bad as they initially seem, the inevitable by-product of a fundamental and uncompromising unwillingness among relevant parties to restrain the development of a weapon or technique in war.

Other failures, however, barely deserve the name. They are laden with future possibility; and may prove necessary to the eventual securing of reform. It is the latter failure that best characterizes the CCW process on autonomous weapons.
