The COVID pandemic spurred a revision of US pathogen research rules. Will it help?

A US Centers for Disease Control and Prevention scientist works in a biosafety level 4 lab. Credit: US Centers for Disease Control and Prevention.

With many experts believing, or at least entertaining, the unproven possibility that COVID-19 originated in a laboratory, the pandemic reignited a debate over how the government should oversee federally funded research experiments that modify or create pandemic-capable pathogens or involve especially dangerous viruses and bacteria. In response, the US government issued a long-awaited policy update last month intended to better manage the hazards associated with risky biological research.

Covering work on disease-causing agents, agents capable of deleterious impacts on the environment, and eradicated or extinct pathogens like smallpox, the new policy will, depending upon the experiment, subject certain research to one of two levels of oversight. The first, so-called category 1 research, involves dual use research of concern (DURC)—studies that produce results that could be misapplied, say, by terrorists or state bioweapons programs. The second, category 2 research, covers experiments with pathogens with enhanced pandemic potential (PEPP): genetic modifications that confer increased pathogenicity or transmissibility on pathogens with the potential to cause a pandemic, a body of work sometimes called gain-of-function research. Category 2 research is subject to extra scrutiny by the funding agency.

The new policy, which goes into effect in May of 2025, combines portions of earlier policies and frameworks geared toward the oversight of pandemic pathogen experiments. So, what’s in it and will it help make potentially high-risk research safer?

The good. The policy emphasizes that researchers should consider risks from the proposal stage of an experiment through to the publication and dissemination of results. Under the old policy on gain-of-function research involving pandemic pathogens, the government itself assessed whether to give a study an extra level of review. The new policy engages researchers and institutions in determining which category of research an experiment involves, a change that, in theory, could lead to more appropriate oversight. Under the new policy, research institutions must certify compliance when seeking federal funding and maintain an ongoing dialogue with federal agencies about their risk mitigation plans if any research falls under category 1 or 2 of the policy.

The new policy provides a comprehensive framework for identifying and managing dual use research of concern and research involving pathogens with enhanced pandemic potential. For example, the old dual use research of concern and enhanced pandemic pathogen policies focused on a limited list of 15 pathogens and toxins and seven experiments of concern. The new policy expands the scope of oversight by substantially broadening both the list of pathogens and toxins and the types of experiments that are covered.

Researchers make an initial assessment of which category their study falls in, a determination that is later confirmed by their institution and the federal government. For both categories of experiment, the research institution weighs the benefits and risks of the work and develops a risk mitigation plan, which is also reviewed by the funding agency. The policy identifies specific experimental outcomes that can place a study in category 1 or category 2, the latter of which triggers an extra level of review. The policy retains the requirement for the creation and use of institutional review entities (IREs), which serve as intermediaries between researchers and the funding agencies and as an additional layer of biosecurity oversight.

To aid institutions in following the policy, the update comes with an 84-page implementation guide replete with flowcharts, diagrams, and case studies. For example, one case study involves research on seasonal influenza to develop more effective vaccines using innovative genetic techniques that could present biosecurity concerns. Typically, experiments involving seasonal influenza do not necessitate heightened oversight—falling outside both categories 1 and 2. However, certain experiments might reasonably be expected to endow the virus with characteristics of a pathogen with pandemic potential. An experiment could, for example, enhance the virus’s ability to evade pre-existing immunity from vaccination or natural infection, thereby making the work, at a minimum, a category 2 experiment. Another case study explores an experiment involving the mpox virus that could confer higher morbidity and mortality by increasing the virulence of the agent.

Both examples detail how researchers and institutions could mitigate risks by regularly reviewing experimental findings, developing responsible communication and risk management plans, and restricting information hazards from public release if warranted.

As technology develops, so do new biosecurity concerns.

Another positive aspect of the policy is that it provides voluntary guidance for relevant work with computational models and artificial intelligence (AI). Researchers could, say, develop dual-use software models that predict DNA sequences expected to increase virulence or transmissibility, a feat that could, ultimately, help develop custom AI-derived organisms. According to the new implementation guide, this type of work should involve consideration of how to responsibly share the results of such a study, whether through publication in a journal or through social media. The policy also encourages institutions to look beyond the various lists of agents during research reviews and to include experiments with agents not listed in the policy.

The new policy applies to any US-based and international institutions that receive US government funding for life sciences research.

Gaps. While the policy is commendable for addressing transparency and emerging fields such as artificial intelligence, and for expanding the scope of oversight, it falls short in several other areas. These include ambiguous terminology, vague training and communication guidelines, and nebulous compliance mechanisms for international collaboration.

The policy leaves itself open to interpretation in some places.

In discussing research that may result in pathogens with enhanced pandemic potential, the policy uses terminology such as “reasonably anticipated” in describing how to identify experiments for both categories of oversight. It also relies on terms and phrases, such as “non-trivial,” “high confidence,” “highly unlikely,” and “based on current understanding.” These terms are crucial because they help policymakers, researchers, and safety professionals evaluate the likelihood and significance of research outcomes. However, their subjective nature poses challenges because their interpretation can vary among different reviewers and institutions, leading to inconsistencies in policy application. These challenges could be further exacerbated by potential inconsistency in the implementation of the policy by the various federal agencies that fund life sciences research.

This variability can affect the fairness and uniformity of oversight, making it difficult to develop clear, understandable, and actionable guidelines. For example, debates have arisen around whether the Department of Health and Human Services appropriately reviewed certain gain-of-function research projects under the Potential Pandemic Pathogen Care and Oversight (P3CO) framework, one of the two policies slated to be replaced in 2025. Critics argue that the scope of that policy was sometimes too narrow, leaving out potentially risky research that should have been reviewed. Examples include an experiment at the University of North Carolina involving the creation of a chimeric SARS virus using a spike protein from a bat coronavirus, research at Boston University that involved inserting the omicron variant’s spike protein into the ancestral strain of SARS-CoV-2 to create a chimeric virus, and an experiment at the National Institutes of Health that involved enhancing the mpox virus to better understand its transmissibility and virulence. The department subjected only three research proposals to extra scrutiny under the policy.

The new policy attempts to address these issues by broadening the scope of covered pathogens and experiments and by providing more detailed guidelines on categorizing different studies.

However, there may well be cases where the manner and level of oversight of a project depends on judgment calls by researchers, institutions, and regulators. While the policy calls for establishing an appeal process at the institutional level when there is disagreement about whether research falls under category 1 or category 2, no such mechanism exists for disagreements with decisions made at the federal agency level.

The new policy does not adequately consider the extensive training and awareness that will be required of researchers, biosafety officers, research administrators, and federal funding agencies. Institutions must now provide detailed reports on high-risk research activities within narrow timeframes, whereas previously they operated under broader timeframes for reporting and compliance.

Additionally, the policy emphasizes responsible communication of research and its outcomes, which is not always a strength of scientists. Strong collaboration will therefore be necessary between an institution’s science communication team and researchers conducting dual use research of concern or working with pathogens with enhanced pandemic potential.

Without proper preparation, institutions may struggle to comply with the policy’s requirements, leading to lapses in oversight and potential loss of funding opportunities.

The Wuhan Institute of Virology in Wuhan, China. Credit: Ureem2805 via Wikimedia Commons. CC BY-SA 4.0.

The variability in biosecurity regulations across countries will create inconsistencies in how dual use research of concern and pathogens with enhanced pandemic potential are managed globally. Enforcing compliance in international collaborations remains challenging, requiring cooperation from foreign governments and institutions. As recent events demonstrate, there are significant ramifications for failing to be transparent and cooperative with regulatory investigations. EcoHealth Alliance, a US-based, federally funded nonprofit that worked with the Wuhan Institute of Virology on bat coronavirus experiments, was unable to comply with a National Institutes of Health (NIH) demand that it retrieve lab documentation from China.

US biosecurity policies to date have lacked robust mechanisms to enforce compliance abroad, making it difficult to ensure that international partners adhere to the same stringent standards as domestic contributors. For example, the NIH Office of the Inspector General reported that the NIH frequently failed to comply with audit requirements, which are essential for ensuring that grant recipients maintain sound financial practices and internal controls; such failures increase the risk of mismanagement of taxpayer dollars supporting research.

Effective data sharing is essential but may be hindered by political differences, intellectual property concerns, and differing data protection laws. Furthermore, resource disparities between countries will impact the ability to meet the high standards set by the policy, particularly in developing nations.

Another concerning element of the policy is that its delineated oversight process makes no mention of the biosafety professionals who currently work within institutions to ensure that experiments are carried out safely and in accordance with laws, regulations, and policies.

In a clear limitation, the policy’s focus on federally funded research also neglects the expanding realm of private research.

At the 2024 Harvard Yale Biosafety Symposium in Boston, Massachusetts, one of us (David Gillum) presented an overview of US biosafety and biosecurity oversight of dual use research of concern and of research with pathogens with enhanced pandemic potential. The research team found that, in a 2024 survey with 521 respondents, 128 people said they were engaged in dual use research of concern experiments, 34 of whom worked for private organizations. Notably, of the 64 respondents who reported conducting research on pathogens with enhanced pandemic potential, 23 worked for private companies.

The newly issued US biosecurity policy represents a significant advance in managing the double-edged sword of life sciences research.

As scientific research evolves, so, too, must the frameworks that govern it, ensuring that innovation can continue in a manner that is safe, ethical, and beneficial for society at large. The new policy’s success will hinge on effective implementation, continuous updates, and international cooperation. Time will tell if the government’s new policy proves more effective or less controversial than the ones it is replacing.

Editor’s note: David Gillum’s research is funded by the National Institutes of Health’s National Institute of General Medical Sciences. 

