
Researchers hacked a lab’s pathogen containment system. Was it a good idea to publish the results?

By George Poste, David Gillum | January 19, 2023

Researchers wearing positive pressure personnel suits at a US National Institute of Allergy and Infectious Diseases biosafety level 4 lab. Credit: National Institute of Allergy and Infectious Diseases.

Negative pressure systems help keep dangerous pathogens contained in laboratories and hospital rooms. Given the potentially dire consequences should microbes escape one of these containment facilities, researchers at a major US university sought to probe whether negative pressure systems could be hacked. The team not only disrupted such a system at a real laboratory but also laid out in great detail how to perform the hack.

While addressing loopholes in lab security is critical, by posting their paper online, the researchers broadcast the study’s potentially dangerous results to anyone with an internet connection. The paper highlights a worrisome trend in the life sciences: Scientists are conducting dual-use research—research in which knowledge generated with beneficial aims can also serve malicious ends—with little oversight and then freely disseminating their findings. The paper on hacking biological containment controls came in the wake of another publication from Boston University. Researchers there created a hybrid COVID virus that, while less deadly to mice than an ancestral version of SARS-CoV-2, was deadlier than the omicron variant. In another controversial case, the National Institutes of Health (NIH) proposed hybridizing the monkeypox virus to see which genes made it more virulent.

For policymakers, the studies underscore a long-standing problem: how to encourage as much scientific and technological progress as possible, giving researchers broad leeway and allowing them to share their results widely, while placing restrictions on research that might be too risky to pursue or publicize. Dual-use concerns exist in other fields, like physics. But to a seemingly greater degree than in other areas, the life sciences are struggling with a lack of clarity on oversight policies for dual-use research, including those regarding publication. This confusion is occurring against the backdrop of accelerating advances in genetics, synthetic biology, and artificial intelligence, which mean that an ever-increasing fraction of research in biotechnology poses potential dual-use risks. A terrorist group or adversarial country could try to obtain dual-use research and incorporate it in weapons or use it for some other nefarious purpose, as has happened in the past in other areas, such as nuclear weapons and pathogen research.

Some prominent journals have developed enhanced review processes for dual-use pathogen research. But objections to constraints on academic inquiry have hampered past efforts to control how dual-use research is published in journals. And while journals may have overly lax standards, much research these days doesn’t even go through these gatekeepers, at least initially. So-called preprint servers allow authors to post papers directly to the internet, bypassing the editorial and peer review processes adopted by credentialed scientific journals. This trend is gaining momentum, complicating proactive scrutiny of potential risks.


The rationale offered by researchers who publish information that could be exploited by adversaries is that revealing vulnerabilities will stimulate development of remedial countermeasures. Indeed, the negative air pressure systems researchers included methods to counteract the hack they outlined in their paper. The question, of course, is whether the labs, hospitals, and biotechnology companies that operate negative air pressure systems would implement the proposed fix before a malicious actor could exploit the vulnerability.

The negative air pressure systems researchers could have chosen not to place their research in the public domain and instead to communicate with those who might be able to implement risk control measures. These include the manufacturers of the vulnerable control systems; government agencies with laboratory security oversight; and the American Biological Safety Association, an association for biosafety professionals, which could distribute information about the vulnerability through its networks. The cybersecurity community has long adopted a “white hat” practice of sharing discovered software and hardware flaws with the relevant vendors so that patches can be designed and installed ahead of public disclosure.

The seeming lack of appreciation for dual-use risks of publication echoes a broader confusion over regulations and policies related to dual-use life sciences research.

Other than formal regulation of the most lethal pathogens, oversight of dual-use biotechnology research has been erratic, driven largely by reactive, ad hoc responses to the latest controversial publication. Policy relies heavily on guidelines that are vague, allow for broad interpretation, lack enforceability, and often apply only to federally funded research.

When it comes to the security of biological materials, there are also very few requirements to protect information. The most recognized regulations that address the security of biologicals are the Federal Select Agent regulations, which govern access to a very specific list of biological agents and toxins. These regulations do not expressly dictate what someone can or cannot say about biological materials, although it is common practice for those authorized to work with the materials not to share anything related to how to gain access to them.


Under current policies, the Boston study, the negative air pressure systems study, and the NIH’s monkeypox proposal did not meet the criteria for prepublication government review, a process the NIH applies to federally funded researchers working with one or more of 15 agents. The NIH states that one of the goals of the review process is “to ameliorate dual-use concerns before publication review.” Absent a more coherent oversight framework for dual-use research, self-regulation by the academic community has become the default position in decisions about what research to carry out and whether to limit the risk from open publication.

The challenge in managing the risks of dual-use research lies in calibrating oversight to prevent accidental harm or adversarial exploitation without slowing scientific progress that serves the public good and national competitiveness in technology innovation. This challenge was met in the Cold War era: Government, academia, and industry achieved consensus on how advances in physics, engineering, and computing could be partitioned, keeping work that served national security needs out of the public domain without hindering the academic and corporate innovation that led to today’s US industrial leadership in aerospace and digital technologies. The time has come for public and private sector stakeholders to bring similar pragmatic wisdom and foresight to the biosecurity challenge.



Keywords: COVID-19, biosafety labs
Topics: Biosecurity

