By James Revill, Clarissa Rios, Louison Mazeaud | November 16, 2024
The development of biological weapons with reliable, predictable effects has proven difficult in the past. Generating militarily efficient and accurate weapons requires considerable skill and knowledge, and past large-scale offensive biological weapons programs demanded intensive investment in infrastructure and equipment, along with legions of scientists. Any contemporary program would face similar challenges.
Yet with frequent headlines documenting the rapid evolution of artificial intelligence (AI) technologies, the community of scholars, diplomats, scientists, and others involved in biological arms control has been engaged in an intense debate over how AI will affect their efforts.
The rapid emergence of AI might, for instance, undermine export controls meant to protect potentially weapons-related technologies and information, lower the barrier to weapons development, or assist in the development of new types of weapons. While AI also has positive implications for biological arms control, policy makers and researchers—internationally and nationally—must strive to understand the technology as it evolves in order to implement appropriate policies to guard against its risks.
AI risks. AI and other converging technologies could be applied to produce biological weapons that appear very different from those of the past, which often relied on existing, known agents like anthrax to cause disease or death among targeted populations. This could challenge assumptions about what biological weapons look like and lead to questions about the breadth of coverage of the Biological Weapons Convention, the global treaty banning bioweapons.
For example, biological design tools and other forms of generative AI can be used to design molecules with desired bioactivity, toxicity, and molecular properties. Such tools could be used to develop a new generation of biological weapons with novel characteristics such as the capacity to evade conventional detection methods.
Large language models power tools like chatbots that can generate and process large amounts of text. Such models could reduce some of the skill required to perform biological experiments, facilitate the acquisition of biological weapons and related materials, and aid the planning of an attack by a wide range of hostile forces, potentially including non-state actors.
Large language models could also play a role in facilitating and accelerating the spread of mis- and disinformation related to any disease outbreak or deliberate biological event; for example, by obscuring the origins of a disease. In 2022, the UN Interregional Crime and Justice Research Institute released a handbook on combating weapons of mass destruction disinformation that highlighted the risk of AI techniques being used to influence opinion by generating fake content and eroding public trust—notably on vaccines. There is a risk that, applied to a public health crisis, such AI-generated falsehoods could mislead investigations, interfere with medical countermeasures, or generate panic.
AI and biological arms control. Several factors could influence any decision to pursue an offensive biological weapons program, including the perceived challenges in developing such weapons. Fortunately, there are several measures in place designed to prohibit and prevent biological weapons, including the Biological Weapons Convention.
Under the treaty, 187 countries—the vast majority of the world’s nations—have committed never in any circumstances to “develop, produce, stockpile or otherwise acquire or retain: microbial or other biological agents, or toxins whatever their origin or method of production, of types and in quantities that have no justification for prophylactic, protective or other peaceful purposes.”
The intent-based definition of biological weapons should future-proof the treaty and ensure it remains comprehensive in scope even as technology changes. An AI-augmented biological weapon is still a biological weapon and therefore prohibited under the treaty.
Technological development may nonetheless complicate efforts to implement biological arms control. For example, export controls play a crucial role in preventing the proliferation of biological weapons. They work by regulating the transfer of dual-use materials and equipment and are often implemented through licensing or permit requirements. The measures help ensure that sensitive technologies are used only for peaceful purposes, without restricting access to necessary materials for legitimate civilian uses. However, the integration of AI and biological design tools could complicate the development of effective export controls for these sorts of intangible technologies—already a hard proposition given the difficulty of regulating a trade that doesn't involve a physical product.
Intangible technologies include non-physical knowledge, data, or digital information, such as software code, algorithms, or genetic sequences which can be transferred electronically. Controlling the spread of dual-use digital biological information may be particularly challenging. Nefarious actors could theoretically use open-source, freely accessible biological design tools anywhere. They would not need access to known sensitive information (e.g., a gene sequence involved in pathogenicity) or receive any physical material. They could “discover” dangerous proteins or genetic sequences independently without needing prior access to known, dangerous ones, making it difficult to subject hitherto unknown biological material or information to regulations such as export controls. Although steps have been taken in a number of countries to screen purchases of synthetic nucleic acids, dealing with intangible technology and information hazards around the globe remains difficult and politically challenging.
Developments in AI may also require new national laws or regulations, and ways of engaging scientists and industry.
Under the Biological Weapons Convention, parties to the treaty are obliged to take any necessary measures to “prohibit and prevent” biological weapons at the national level. Much has been done over the last decade in terms of national implementation of the convention and the wider development of biosecurity measures. Many countries have developed national legislation specifically covering treaty obligations as well as detailed safety and security measures in laboratories.
However, the measures required to prohibit and prevent biological weapons will change over time, particularly as biotechnologies and research and development practices continue to evolve. Laws may need to be amended to include provisions covering access to what are known as “cloud labs,” for instance. (Cloud labs are remote, automated laboratories accessed via cloud-based platforms, allowing researchers to run and monitor experiments remotely using robotic systems. They offer flexibility, cost-efficiency, and scalability without the need for physical presence in a traditional lab.)
These labs could be used to produce dangerous pathogens. To ensure national measures remain able to prohibit and prevent biological weapons, countries will need to ensure that legislation, regulation, and wider risk mitigation measures take in the convergence of biotechnologies with AI developments such as cloud labs.
Additionally, the convergence of biotechnologies with other emerging technologies such as cyber (e.g., biotech espionage, cloud lab sabotage or hijacking, unauthorized access to genetic databases and synthetic biology tools, vaccine infrastructure sabotage during a pandemic, manipulation of biological data, cyberattacks on health systems to spread pathogens, and tampering with genetic data to alter public perception) also calls for new measures to address cyber-biosecurity concerns and prevent illegal digital access to databases and research infrastructure and public health facilities.
Countries will also need to engage the life science community to ensure that biosafety and biosecurity form part of the research ecosystem. Further industry engagement will be particularly important. To do so, countries and stakeholders in research fields and industry must collaborate more closely. To this end, there could also be value in starting voluntary peer review initiatives among industry actors to cooperatively understand one another’s institutional safety and security procedures—and take stock of existing approaches to secure AI and biotechnology.
Can AI help? AI presents not only risks but also several possible opportunities to strengthen biological arms control. At the moment, the Biological Weapons Convention lacks a mechanism for verifying compliance—unlike regimes dealing with chemical and nuclear weapons. A working group is already seeking to address this deficit and develop processes to better evaluate whether a state is compliant with the treaty. AI or other new capabilities could contribute to this effort.
AI could also play a role in facilitating responses to disease outbreaks. For example, AI can help evaluate large datasets to identify potential vaccine targets and forecast the efficacy of vaccine candidates. AI can also play a role in supporting disease surveillance and early warning as well as helping to expedite outbreak responses. Such steps can contribute to the provision of cross-border assistance in the event of a violation of the convention and thereby dampen the effects of biological weapons.
In 2022, the countries in the Biological Weapons Convention agreed to begin developing a process for reviewing scientific and technical developments, a critical need to keep the treaty up-to-date in an era of rapid scientific and technical change. A systematic means of assessing the risks and benefits of developments in the life sciences and converging technologies—including AI but also quantum computing and advanced cyber capabilities—is important. A science and technology review mechanism could address some of the uncertainties surrounding new technologies and develop recommendations to mitigate the risks and collectively exploit the benefits of advances in technology in an equitable manner.
Furthermore, an important aspect of the Biological Weapons Convention is the obligation of states to facilitate international cooperation so that the world can benefit from the positive application of developments in the life sciences. The working group is trying to agree on a mechanism for boosting international cooperation and assistance. This future mechanism should incorporate an understanding of the benefits of AI and how it can contribute to enhancing peaceful bioscience collaboration.
In 2025, the Biological Weapons Convention will be 50 years old. Looking ahead to the next 50 years, treaty members will need to consider how the Convention and the norm against the hostile use of biology can be sustained for the next half-century and beyond. The advancement, convergence, and diffusion of new technologies is already presenting considerable challenges to the governance of dual-use life science research. It is unlikely this challenge will get easier as biotechnology (and other powerful technologies) gathers further momentum. Informed decision making within the Convention and the wider biosecurity regime now can help avert a darker biological weapons future, something in the interest of all humanity.
Editor’s note: The views expressed herein are those of the authors and do not necessarily reflect the views of the United Nations.