Stopping dangerous research before it starts

By Jens H. Kuhn, January 18, 2008

This discussion needs some clarification. Iris Hunger is right in stating that the expertise to build and deploy a biological weapon is distributed among only a few people, who are difficult (but not impossible) to recruit. However, so far this discussion has focused on what can be done with available sequence information, i.e., constructing a pathogen from those data. As I have pointed out, the synthesis of certain viruses is neither laborious nor technically difficult, nor does it require extensive training. Thankfully, the remaining steps to a weapon (quantitative production, concentration, purification, stabilization, and dispersal) are much harder to accomplish.

I agree that a dedicated terrorist may find a way to get the information he or she wants, for example by acquiring a pathogen and sequencing its genome if that sequence is not publicly available. The question that should steer this discussion is whether not-so-dedicated aggressors could easily use sequence information to synthesize pathogens. In my view, dedicated and well-financed terrorists would not need publicly available sequence information to begin a weapons program, and not-so-dedicated terrorists might manage to synthesize a pathogen only to get stuck in the weaponization process, for which they would have to hire those hard-to-come-by experts. Therefore, I argue that the availability of natural pathogen sequences is not dangerous.

Iris did, however, raise an important point that has not been discussed as much as it deserves: “If it is likely that [scientists] will produce knowledge that they do not want to share with others, they simply should not produce it.” Scientists have generated possibly dangerous genomes deliberately (e.g., the 1918 H1N1 influenza A virus) and by serendipity (e.g., the IL-4-expressing mousepox virus). Neither genome occurs in nature anymore. Should oversight systems have prevented these experiments, so as not to make available sequences that have no direct public-health use? And if such experiments are performed, should special rules restrict the public availability of the resulting sequences?

The concept that there might be some information not worth knowing is anathema to scientists, but research decisions are based on many different considerations. Scientists try to fit their research into the prevailing funding environment so that it can be seen as consistent with national priorities. Many scientific questions that would generate new information are thus left unanswered, for lack of expected scientific merit, of publication potential, or, most often, of financial support. This leads me to question whether funding authorities and scientists who construct novel pathogens, and thereby increase the risk that terrorists will repeat the same experiments with malevolent goals, always have their priorities straight.

To begin addressing these issues, I think that scientists and policy experts should work together to establish an oversight system, akin to that proposed by the Center for International and Security Studies at Maryland, that evaluates, accepts, rejects, or modifies proposed scientific projects with potentially high-risk outcomes before any such research is carried out. There should be consensus before a project begins that it is important, i.e., that its public-health benefit outweighs its risks. This way, as Iris says, once an experiment is performed, its findings can be published in full.

As I pointed out before, I don’t think that screening oligonucleotide orders will be useful at this time, because the process is likely to generate too many false-positive alarms. We have already established that the synthesis of pathogens can be done only by scientists, so licensing scientists would be of questionable benefit: terrorists would simply hire a licensed scientist.
