By Moritz Kütt, November 5, 2015
In September, the German automaker Volkswagen admitted that it had secretly programmed the computers in its diesel-engine cars to cheat on mandatory emissions tests. Volkswagen got away with this legerdemain for years because, like most carmakers, it uses proprietary software to control its engines and other systems, software that is still unavailable to vehicle inspectors and independent analysts.
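To see how little code such a cheat requires, consider a minimal sketch of defeat-device logic. Everything below is hypothetical and invented for illustration; it is not Volkswagen's actual code, though the real device reportedly inferred test conditions from sensor readings such as steering-wheel position.

```python
# Hypothetical sketch of defeat-device logic, invented for illustration.
# It is not Volkswagen's actual code; the point is that a cheat like
# this is a few hidden lines, undetectable without the source.

def looks_like_emissions_test(speed_kmh: float, steering_deg: float) -> bool:
    # Lab test cycles run on rollers: the wheels turn while the
    # steering wheel stays perfectly still, a telltale a program
    # can check for. (Assumed heuristic for this sketch.)
    return speed_kmh > 0 and steering_deg == 0

def select_engine_mode(speed_kmh: float, steering_deg: float) -> str:
    if looks_like_emissions_test(speed_kmh, steering_deg):
        return "full-emissions-controls"   # clean, only when being tested
    return "road-calibration"              # higher emissions in real driving

print(select_engine_mode(50, 0))    # during a test cycle
print(select_engine_mode(50, 12))   # on the open road
```

An inspector who only observes the tailpipe during the prescribed test cycle will never trigger the second branch.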
Those concerned about international security should pay attention to the Volkswagen case, since arms-control verification relies on similarly complex combinations of software and hardware, and verification tools often use proprietary or export-controlled technology that prevents transparency and independent scrutiny. Without such transparency, it is very difficult to rule out future incidents of cheating. As long as nations are willing to take on the trouble and risk of building clandestine weapon-production facilities, what would stop them from manipulating verification devices?
The range of verification tools is large, from specific codes used for modeling atmospheric transport or nuclear reactors to general software for communication and signal processing; from measurement devices, such as gamma spectrometers and neutron detectors, to common computer systems and cameras. The tools are used by both individual states and international organizations.
Consider, for example, the information barrier, a combination of hardware and software that has been proposed for critical arms-control verification tasks. A typical application would be warhead authentication, where it might be used to process the gamma-ray spectrum of a warhead and reduce the data to a simple binary signal (e.g., a red or green light) indicating whether or not the object is a warhead. In another application, an information barrier might be used to detect key pathogens that identify biological weapons. But how useful is this device if some parties don’t trust it? While observing a demonstration of a US-developed information barrier in the early 2000s, a Russian scientist reportedly quipped that all he saw was a green LED connected to a battery.
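The core idea is that only one bit of information ever crosses the barrier. The sketch below shows what that logic might look like; the energy window and count threshold are hypothetical placeholders, not values from any real attribute-measurement system.

```python
# Minimal sketch of information-barrier logic. The region of interest
# and threshold are hypothetical; a real system would test several
# classified attributes inside certified, tamper-sealed hardware.

ROI_KEV = (370, 400)     # hypothetical gamma energy window, keV
MIN_COUNTS = 5000        # hypothetical attribute threshold

def authenticate(spectrum: dict[int, int]) -> str:
    """Reduce a gamma spectrum {energy_keV: counts} to one bit.

    Only the verdict leaves the barrier; the raw spectrum, which
    could reveal sensitive warhead design information, never does.
    """
    lo, hi = ROI_KEV
    net = sum(c for e, c in spectrum.items() if lo <= e <= hi)
    return "green" if net >= MIN_COUNTS else "red"

inspector_sees = authenticate({380: 3000, 390: 2500, 414: 800})
print(inspector_sees)   # "green" or "red", nothing more
```

The Russian scientist's quip captures the problem exactly: without access to this source code and the hardware running it, the green light proves nothing.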
Consider a second example: In the recent agreement on Iran’s nuclear program, the parties agreed to modify the Arak research reactor to reduce the amount of plutonium it can produce. Such modifications will require extensive computer simulations of the new reactor design, which must be approved by all parties to the agreement before it can be built. One of the most common tools for such modeling is a Monte Carlo particle-transport code called MCNP, developed by US weapons laboratories; under US export-control laws, it cannot be exported to Iran. Unless all parties can use the same computer codes, there will be no way to independently check the calculations for the new Arak design.
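To give a flavor of what such codes do, here is a toy Monte Carlo transport calculation: it estimates the fraction of neutrons that pass through a one-dimensional absorbing slab. The slab thickness, cross section, and absorption probability are invented for illustration and have nothing to do with the Arak design or with MCNP's actual input format.

```python
# Toy illustration of the Monte Carlo method behind codes like MCNP:
# estimate neutron transmission through a 1-D slab. All parameters
# are invented for illustration only.

import math
import random

SLAB_CM = 10.0       # hypothetical slab thickness, cm
SIGMA_T = 0.30       # hypothetical total macroscopic cross section, 1/cm
P_ABSORB = 0.5       # hypothetical chance a collision absorbs the neutron

def transmission(n_histories: int = 100_000, seed: int = 42) -> float:
    rng = random.Random(seed)
    escaped = 0
    for _ in range(n_histories):
        x = 0.0
        while True:
            # Distance to the next collision, sampled from an
            # exponential distribution with mean free path 1/SIGMA_T.
            x += -math.log(1.0 - rng.random()) / SIGMA_T
            if x >= SLAB_CM:
                escaped += 1     # neutron left the slab
                break
            if rng.random() < P_ABSORB:
                break            # neutron absorbed in the collision
            # Otherwise scattered; this toy keeps it moving forward.
    return escaped / n_histories

print(f"estimated transmission: {transmission():.3f}")
```

Real codes track three-dimensional geometry, energy-dependent cross sections, and scattering angles, which is precisely why every party needs to be able to run and inspect them rather than take the results on faith.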
So how can arms-control efforts overcome such secrecy and mistrust? One positive example is the international monitoring system of the Comprehensive Nuclear-Test-Ban Treaty Organization, which has tried to lessen its dependence on proprietary technology by replacing it with open source alternatives or in-house developments.
Scientists, engineers, and the arms-control community at large should make similar efforts to develop new tools based on open source norms, guided by two key criteria: First, there should be no restrictions on access to programs or equipment used for verification purposes. Second, that access must include software source code and hardware designs. These criteria derive from the ideas of two communities that have shaped open source software development: the Free Software Foundation, founded in 1985 by Richard Stallman, and the Open Source Initiative, founded in 1998.
An open source push would increase the trust of parties to arms-control agreements; while it would not eliminate the possibility of cheating, it would make cheating considerably easier to detect. It would also attract more scrutiny from more places, from hackers to the general public. As Eric S. Raymond, one of the founders of the Open Source Initiative, famously put it, “given enough eyeballs, all bugs are shallow.” Put simply, with more people looking, the chances are greater of detecting and fixing both malicious manipulations and innocent flaws.
Some critics argue that making source code available leaves software vulnerable to attackers. But any software, proprietary or open, will have weaknesses, and broad, independent analysis is by far the best way to find them.
Reaching out beyond arms-control experts to the hacker community would bring in many skilled computer experts drawn to difficult technical challenges. Because that community prizes openness, applying open source criteria to arms-control tools could motivate hackers to lend their skills to the field.
Opening up verification tools to the larger public could bring together verification by arms-control experts and the world of “societal verification.” Such crowd-sourcing would allow concerned citizens to use everyday technologies to contribute to the arms-control process. Of course, providing open access to verification tools would have some costs in certain areas, such as export controls and the sale of proprietary software. Yet the benefits of a more democratic arms-control process, one that more parties trust and more everyday people care about improving, seem worth this cost.