
How to make biomedical research (and biosafety labs) less dangerous and more ethical, post-COVID-19

By Laura H. Kahn | June 8, 2021

Researchers wearing positive pressure personnel suits at a US National Institute of Allergy and Infectious Diseases biosafety level 4 lab. Credit: National Institute of Allergy and Infectious Diseases.

Our luck has run out. The worst pandemic in a century has killed over 3.7 million people globally. In the United States, almost 600,000 have lost their lives to COVID-19. Societies around the world have been, and many are continuing to be, devastated.

The debate regarding the origins of the virus continues, with growing circumstantial evidence that the virus leaked from a laboratory. Knowing the origins of SARS-CoV-2 is important if we want to prevent this catastrophe from happening again.

We can state with certainty that human activities including deforestation, wildlife trade and consumption, and intensive animal agriculture increase the risk of deadly pandemics.

Preventing the emergence of naturally occurring zoonotic diseases requires a One Health approach that integrates human, animal, plant, environmental, and ecosystem health. I’ve written extensively about why a One Health approach is important in previous columns.

But to what extent do gain-of-function research, lax biosafety and biosecurity oversight, and minimal bioethics review of basic science research contribute to pandemic risk?

Let’s, for argument’s sake, assume that the pandemic originated from a laboratory-acquired infection. Pointing fingers or placing blame is not helpful. No laboratory is immune to accidents, especially those working with active bioagents like viruses. Laboratory spillover events have happened in the past, and they will continue to happen in the future. The question is: how can we reduce the risks?

Gain-of-function research. In 2004, the National Academies’ National Research Council published a report, “Biotechnology Research in an Age of Terrorism,” that listed seven “experiments of concern” that should not be done. These experiments of concern, rendered as a toy screening checklist in the sketch after this list, include:

  • Demonstrating how to make a vaccine ineffective
  • Conferring resistance to antibiotics or antiviral agents
  • Enhancing a pathogen’s virulence or making a non-virulent microbe virulent
  • Increasing the transmissibility of a pathogen
  • Altering the host range of a pathogen
  • Enhancing a pathogen’s ability to evade diagnostic or detection modalities
  • Enabling the weaponization of a biological agent or toxin.
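To make the checklist idea concrete, here is a minimal sketch, in Python, of how an oversight body might screen a proposal’s self-declared aims against these seven categories. The tags, function, and example aims are hypothetical illustrations, not an actual screening tool.

```python
# Illustrative only: a toy checklist for flagging research proposals against
# the seven "experiments of concern." All identifiers are hypothetical.

EXPERIMENTS_OF_CONCERN = {
    "vaccine_ineffective": "Demonstrating how to make a vaccine ineffective",
    "antimicrobial_resistance": "Conferring resistance to antibiotics or antivirals",
    "enhanced_virulence": "Enhancing virulence or making a non-virulent microbe virulent",
    "increased_transmissibility": "Increasing the transmissibility of a pathogen",
    "altered_host_range": "Altering the host range of a pathogen",
    "detection_evasion": "Enabling evasion of diagnostic or detection modalities",
    "weaponization": "Enabling weaponization of a biological agent or toxin",
}

def screen_proposal(declared_aims: set) -> list:
    """Return the categories of concern that a proposal's declared aims touch."""
    flagged = sorted(declared_aims & EXPERIMENTS_OF_CONCERN.keys())
    return [EXPERIMENTS_OF_CONCERN[tag] for tag in flagged]

# A proposal declaring transmissibility and host-range work would be flagged
# for heightened review rather than approved through the routine channel.
for concern in screen_proposal({"increased_transmissibility", "altered_host_range"}):
    print("Flagged:", concern)
```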

In response to the report, the National Institutes of Health (NIH) created the National Science Advisory Board for Biosecurity (NSABB) but allowed controversial gain-of-function research to continue.

Gain-of-function research increases a pathogen’s ability to cause disease. The argument in support of this work is that it helps to assess the pandemic potential of infectious agents and assists government officials in developing public health response measures. The argument against is that the work is inherently risky and clearly meets the criteria of the National Academies’ seven experiments of concern.

A series of laboratory accidents at the US Centers for Disease Control and Prevention (CDC) prompted the NIH in 2014 to stop funding gain-of-function research involving pandemic-potential viruses such as influenza and the SARS and MERS coronaviruses. But in January 2017, the moratorium was lifted after the NSABB concluded that the experiments posed little risk to public safety. Of note, many of the board members were “very experienced, very actively involved in research.” In other words, they had conflicts of interest in overseeing and approving this research.

The NIH grant description for EcoHealth Alliance’s project, “Understanding the Risk of Bat Coronavirus Emergence,” very clearly describes gain-of-function research that creates novel coronavirus genomes and uses them to experimentally infect hosts ranging from cell cultures of various animal species to humanized mice. The project was funded from June 2014 to May 2019 and, according to an article in Nature Medicine, the work was performed in biosafety level 3 (BSL 3) facilities.


Some history on biosafety and biosecurity. Concerns about health risks from new recombinant DNA technologies prompted scientists to meet in 1975 at the Asilomar Conference Center in Pacific Grove, California. They had voluntarily stopped certain experiments until they could be sure that risks to public health were minimal. One outcome of the Asilomar conference, as it came to be known, was the creation of safety guidelines tiered according to the degree of risk the research involved. Biosafety level one (BSL 1) research represented minimal risk and could be done on an open bench, whereas biosafety level four (BSL 4) constituted the highest risk and required airlocks and space suits. Biosafety levels two and three required progressively more specialized equipment and facilities.
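The tiered scheme lends itself to a simple mapping. Below is a minimal sketch, assuming (for illustration only) that a lab’s risk assessment reduces to a 1-to-4 score; the risk wording and containment summaries are paraphrased, not the official CDC/NIH definitions.

```python
# A simplified model of the four-tier biosafety scheme described above.
# Risk wording and containment summaries are paraphrased for illustration,
# not the official CDC/NIH definitions.

from dataclasses import dataclass

@dataclass
class BiosafetyLevel:
    level: int
    risk: str
    containment: str

BSL_TIERS = [
    BiosafetyLevel(1, "minimal risk", "open-bench work with standard practices"),
    BiosafetyLevel(2, "moderate risk", "biosafety cabinets and restricted access"),
    BiosafetyLevel(3, "serious or lethal via inhalation", "specialized facility with directional airflow"),
    BiosafetyLevel(4, "highest risk", "airlocks and positive-pressure suits"),
]

def required_containment(risk_score: int) -> BiosafetyLevel:
    """Map a 1-4 risk assessment to its containment tier (clamped to range)."""
    return BSL_TIERS[min(max(risk_score, 1), 4) - 1]

print(required_containment(4).containment)  # airlocks and positive-pressure suits
```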

In response to the Asilomar conference, the field of biosafety was born in 1984 with the establishment of the American Biological Safety Association and the creation of the advisory document Biosafety in Microbiological and Biomedical Laboratories, which promotes best practices. In addition, Institutional Biosafety Committees (IBCs) were created to oversee the biosafety of recombinant DNA research at all institutions receiving NIH funding. Committee membership must include scientists, laboratory personnel, and two community members not affiliated with the institution. Members meet monthly to review research protocols and to decide at which biosafety level each should be conducted.

There is a weakness in this biosafety regime, however. There is no surveillance system for laboratory-acquired infections, and if they occur, there is no mandatory mechanism in place to notify state and local health officials. Laboratory-acquired infections are not “notifiable” diseases under CDC guidelines, so they don’t get reported to local and state health officials. The CDC does not collect data on laboratory-acquired infections because they are considered occupational exposures (https://www.cdc.gov/surveillancepractice/data.html). The National Institute for Occupational Safety and Health (NIOSH) focuses on workplace-related injuries and illnesses, including pesticide and chemical exposures, healthcare worker injuries, and blood lead level surveillance, but not laboratory-acquired infections.

In other words, laboratory-acquired infections fall through the cracks in government surveillance systems.

Because of concerns about biosecurity, the types of microbes that research labs are working on are not shared with state and local health officials. In essence, health officials are in the dark when it comes to potential laboratory-acquired infections in their jurisdictions. This lack of awareness hinders public health preparedness efforts in cases of high-risk laboratory-acquired infections with potential to spread in the community.

Laboratory accidents and laboratory-acquired infections occur, but without good surveillance systems at regional, national, and international levels, it’s difficult to know the extent or severity of the problem until a disaster strikes. Of note, biosafety concerns existed at the Wuhan Institute of Virology long before the emergence of SARS-CoV-2.

Bioethics. Because of the long history of unethical medical research—from the Nazi experiments in Germany during WWII to the US Tuskegee syphilis experiments from 1932 to 1972—Congress passed the National Research Act in July 1974, establishing a national commission to protect the subjects of human research. The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research issued the Belmont Report, which identified the basic ethical principles underlying all human-subject research. The field of medical ethics was born.


There is no field of biomedical research ethics. Aside from the scientists themselves, who have obvious conflicts of interest, there are no experts who can assess whether biomedical research proposals meet ethical standards. Just because an experiment can be done doesn’t mean it should be done. Until COVID-19, society had largely abdicated research funding decision-making to the scientific community, with little, if any, oversight or input from humanists, ethicists, or public health professionals who might not share the scientists’ views. And the fact that scientists made the mere suggestion of a laboratory origin for SARS-CoV-2 a taboo subject demonstrates that they are incapable of policing themselves.

Next steps. As Winston Churchill, among many others, has been credited with saying, “Never let a good crisis go to waste.” We have an opportunity to make biomedical research safer and more ethical.

Laboratory-acquired infections need to become diseases notifiable to local and state health departments. If they fall ill, researchers and other laboratory workers must notify healthcare professionals that they work with bioagents in biomedical clinical or research facilities, and federal regulations should be changed to require such reporting. Healthcare professionals should notify local public health officials, who would report to the state and ultimately to the CDC, working in concert with the National Institute for Occupational Safety and Health. Similar surveillance systems should be established in all countries with biomedical clinical and research facilities.
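Sketched as code, the proposed chain is a straightforward one-way escalation. The names, function, and message format below are hypothetical; actual regulations would have to define the real case data and confidentiality rules.

```python
# A toy model of the proposed reporting chain for laboratory-acquired
# infections. The chain, function, and message format are hypothetical
# stand-ins for what actual regulations would have to specify.

REPORTING_CHAIN = [
    "healthcare professional",
    "local public health department",
    "state health department",
    "CDC, in concert with NIOSH",
]

def report_laboratory_infection(worker_role: str, agent: str) -> None:
    """Escalate a suspected laboratory-acquired infection up the chain."""
    event = f"suspected laboratory-acquired infection ({worker_role}; agent: {agent})"
    for recipient in REPORTING_CHAIN:
        # A real system would carry case details, timestamps, and
        # confidentiality protections; here we only log each escalation.
        print(f"notify {recipient}: {event}")

report_laboratory_infection("research technician", "SARS-like coronavirus")
```

Even this toy version makes the key point visible: the chain only works if the first notification, from worker to clinician, is mandatory.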

At the international level, the World Health Organization should work with the Biological Weapons Convention (BWC) Implementation Support Unit to create a laboratory-acquired infection surveillance system based on data collected and reported at the national levels. Biomedical research is inherently dual use; it can provide society great benefits, but it can also be used for ill. The collaboration between WHO and the BWC would send the message that the international community takes these issues seriously.

The field of biomedical research ethics needs to be created. As with human-subject research, Congress should establish a commission to identify the ethical principles underlying all biomedical research. The National Academies’ seven “experiments of concern” should serve as a framework for the commission’s work.

We don’t know if gain-of-function research caused this pandemic or if it was naturally occurring. But the arguments for gain-of-function research can be countered on two points. First, the mRNA vaccines that were developed so quickly in response to the pandemic built on decades of prior research and required only the genetic sequence of the virus’s spike protein. Gain-of-function research was not needed. Second, investments in public health would improve response capabilities far more than any information that gain-of-function research could provide.

With diminishing public trust in science, biomedical scientists should be incentivized to rebuild society’s willingness to support their research. Hubris and a willingness to push the scientific envelope for fame and glory should be replaced with humility and a respect for nature. We don’t need scientists helping nature to make deadlier pathogens in a misguided effort to improve public health. Transparency, public communication and outreach, laboratory-acquired infection surveillance, public health partnerships, and institutionalized ethics would go a long way toward regaining the public’s trust.



3 Comments
Brian Whit
3 years ago

[What a joke of a “comments section”. Your moderators don’t allow comments, only sifted, carefully edited, thoughtful prose. I can only imagine this as a stage for elite-school grads hopeful for a sinecure in a think tank, or legacy folks hopeful for greater things. Is a thought like “wow, bureaucracy in any nation makes level 4 biolabs problematic” just not worthy?]

Edward Hammond
3 years ago

I’ll add a few things. First and in general, I would rely on law rather than ethics. I lost faith in bioethics as a young man when I sat at the dinner table with the representatives of a large European agrochemical company and a Canadian bioethicist whose liturgies were favored by that industry. The two of them compared notes on their Rolexes for most of the meal. Bioethicists are for sale and can and will be captured. Most of the prominent bioethicists working on gene drives, for example, are being paid by publicly-funded projects with an explicit intent of… Read more »

Charles Forsberg
3 years ago

There are two problems with lab leaks of airborne viruses. First, in many cases we may not know that a leak occurred. Second, if a Level 4 facility is located in a major city and the wrong airborne virus gets out, high population densities combined with mass transit and international air travel assure a global pandemic before the locals know they have a problem. There are two solutions. First, locate such facilities in areas with low population densities, which lowers the risk of breakout and buys time if a leak occurs. Second, for airborne viruses, improve ventilation where there are high densities of people to limit… Read more »