After the fact: Another controversial virus study raises questions about US government oversight

By David Gillum, Kathleen M. Vogel | October 28, 2022

The National Emerging Infectious Diseases Laboratories at Boston University. Researchers at the center recently conducted a controversial COVID-19 study wherein they created a strain of the omicron variant that proved more deadly to mice than the original omicron. Credit: Z22 via Wikimedia. CC BY-SA 4.0.

In a study published earlier this month, researchers at Boston University sought to find out why the omicron variant of the COVID-19 virus has proven relatively mild. They wanted to know if all the mutations in omicron’s spike protein—the structure that juts from the surface of coronaviruses—made the variant less virulent than others. In a highly secure lab known as a BSL-3 facility, the Boston team tested chimeric (hybrid) viruses on experimental mice. While the combination of the spike from the omicron strain and other viral components from an older variant could have been “mild” like omicron, that didn’t prove to be the case. The new virus killed 80 percent of the mice in an infection experiment, while none of the mice subjected to omicron died. After publishing their results in a preprint study, the researchers, Boston University, and the US National Institutes of Health had a PR crisis on their hands.

The new lab-created virus was in fact less deadly to mice than the original non-omicron strain that made up the chimera’s core; all of the mice sick from that strain died. But many observers framed the experiment as risky or reckless. “Boston University researchers claim to have developed new, more lethal COVID strain in lab,” a Fox News headline declared. Institutional reviewers at the university as well as Boston city health officials vetted the research, but that didn’t stop an outpouring of public concern over the experiment. And it didn’t stop the National Institute of Allergy and Infectious Diseases (NIAID) from pledging to review the team’s work. The uproar was a sign of a larger issue: Government agencies and research institutions alike are struggling to interpret federal guidance as they walk the fine line that differentiates harmless and harmful biological research.

The Boston team didn’t feel the need to bring up the omicron study with NIAID for two reasons, a university statement said. The school said the government didn’t directly fund the experiment, and that the experiment didn’t involve a gain of function, whereby researchers create a more dangerous potential pandemic pathogen. In the current system of biosafety and biosecurity oversight in the United States, institutions are on the front lines of ensuring they follow dual-use and potential pandemic pathogen rules. US policy on these types of experiments requires institutions receiving federal funding to review research proposals and monitor compliance as a study progresses.

There are federal government policies for oversight of dual-use and potential pandemic pathogen research—experiments that could create knowledge that might pose broad threats to health and security or that involve creating potential pandemic threats. The government does vet federally funded proposals on these grounds and requires ongoing monitoring of experiments that could lead to so-called “enhanced potential pandemic pathogens.” But operationally, each institution is in charge of interpreting federal policy, and every institution conducts research safety oversight differently. There is no standard definition of what it means to be a biosafety officer—an individual who oversees the safe conduct of work with biological hazards—nor are there standardized internal biosafety review procedures. Uniformly applying federal policies to the poorly defined and poorly funded fields of “biosafety” and “biosecurity” is a huge challenge, hence the after-the-fact investigations and reviews when an experiment like the one in Boston makes headlines.

Unfortunately, once again, the federal government is scrambling to respond when triggered by a crisis. Every few years, it seems, someone publishes a new scientific experiment that reveals a lack of proper biosecurity and biosafety review, oversight, and approval. And every single time, the US government tries to figure out what to do with risky research without systematically learning the lessons of the past. It’s time for a change.

One key problem: Many of the recent policies related to research on potential pandemic pathogens are merely “guidelines,” not laws with enforceable penalties. Accountability is key to any sound regulation. But apart from a few highly regulated areas, such as work with pathogens on the federal select agents list, like the anthrax bacterium, there is no legal liability risk for violating policies in this area of research—say, for example, the guidelines for experiments that could create enhanced potential pandemic pathogens. These are enshrined in funding policies created after researchers a decade ago modified highly pathogenic avian influenza viruses to become transmissible among mammals through the air, something the natural pathogen doesn’t readily do. As the Boston University study makes clear, there’s debate about when these guidelines apply.

The oversight system is also vague because of the lack of a universally agreed definition for the word biosafety. Without a common understanding of what it means, there is likely to be wide latitude in how institutions interpret and implement policies and practices. This could lead to riskier science, with potential threats to public and environmental health.

There is yet another challenge to proper biosafety and biosecurity oversight in the United States: The workforce that’s supposed to ensure these policies are properly implemented is precariously funded and aging.

The generation that implemented the federal select agent regulations 20 years ago—among the few federal regulations for biological agents that can be used as weapons, and rules that spell out what can and can’t be done and what the penalties for violating the law are—is now at the forefront of putting into practice the newer dual-use and potential pandemic pathogen policies meant to mitigate the risks of biological research. The scientific community and federal government need to build up this workforce to replace those retiring from the profession and capture their knowledge before they leave. There aren’t always shareable records of how to implement and operate dual-use pathogen research programs.

It can, however, be hard for labs to fill biosafety positions, which often are low-wage jobs with high turnover. Federal grants don’t allow biosafety personnel to be funded as part of research projects, and research institutions, already grappling with increased regulatory compliance costs, must fund them independently. A lack of resources could take the form of a shortage of properly trained staff, inadequate protective equipment and engineering controls, or staff being unable to provide enough (or correct) information to the institutional biosafety committees and biosafety officers that review projects and make decisions regarding risks.

Being part of the biosafety and biosecurity workforce is often a solo mission. There is no publicly available database or network of biosafety professionals working on dual-use and potential pandemic pathogen research. Each institution largely needs to implement these policies with its own resident expertise, or tediously track down other institutions doing related research that it can contact for advice, in the hope that others will share information about their experiences.

Funding biosafety and biosecurity. In general, there is little transparency about the dual-use pathogen research going on nationally: which institutions, funders, researchers, and biosafety personnel are involved. Because of this, there is a growing need for empirical data to help document, assess, categorize, and manage potentially dangerous biological research.

If the government, public, and philanthropic organizations are interested in the safe and secure future of this research, they should prioritize the funding of studies that examine how biosafety and biosecurity policies have been implemented over the past two decades. This would create a large baseline data set on how various institutions have been putting policies into practice.

Instead of seemingly clarifying rules on the go, after a controversial experiment or grant makes headlines, or holding more listening sessions with a familiar retinue of expert faces, science funders should invest in studying how biosafety and biosecurity are implemented. We need an empirical basis to judge how this or that policy—such as the Health and Human Services Department’s process for reviewing gain-of-function work on potential pandemic pathogens—is interpreted by researchers on the ground and the workforce at funding agencies.

Researchers should study the scientists, biosafety professionals, regulatory compliance personnel, policymakers, and others engaged in dual-use and potential pandemic pathogen work to document their experiences in working with oversight policies and determine best practices for the future. The goal is to improve knowledge and understanding of biosafety and biosecurity in ways that maximize the benefits of research and scientific competitiveness while reducing risks.

The National Institutes of Health’s biosecurity advisory board provided recommendations in 2017 on how to address biosecurity in the life sciences, and while the world has experienced a massive pandemic between then and now, little has changed in the oversight of dual-use and potential pandemic pathogen research. As the advisory board considers a 2022 preliminary draft of new recommendations, we are hopeful that it will advance improvements in biosafety and biosecurity funding, operations, research, and oversight that will help keep citizens safe.


3 Comments
MJ
2 years ago

A well-written article. Thank you for the insight into our biosafety and biosecurity research among institutions, and the lack of oversight in this field. I hope we see change in our lifetime! Please further describe or explain what dual-use research is and why it is important in another article!

peter gutierrez
2 years ago

Regarding the Boston University study group article, ‘Role of spike in the pathogenic and antigenic behavior of SARS-CoV-2 BA.1 Omicron’, by Da-Yuan Chen, Devin Kenney, and others: they created their Omi-S chimeric coronavirus to “test the role of the S protein in Omicron phenotype, we generated a chimeric recombinant virus containing the S gene of Omicron (USA-lh01/2021) in the backbone of an ancestral SARS-CoV-2 isolate (GISAID EPI_ISL_2732373).” This Omi-S chimeric coronavirus appears to be a ‘select agent’ according to the CDC list of such agents: “20. SARS-CoV/SARS-CoV-2 chimeric viruses resulting from any deliberate manipulation of SARS-CoV-2 to incorporate nucleic acids…

Martin
1 year ago

That’s a nice analysis. But there is a problem. The research in question did not concern:
“SARS-CoV/SARS-CoV-2 chimeric viruses resulting from any deliberate manipulation of SARS-CoV-2 to incorporate nucleic acids coding for SARS-CoV virulence factors.”

Instead it involved:
“SARS-CoV-2 original variant/ SARS-CoV-2 omicron variant chimeric viruses resulting from any deliberate manipulation of SARS-CoV-2 to incorporate nucleic acids coding for SARS-CoV-2 omicron variant proteins.”
