Horsepox synthesis: A case of the unilateralist’s curse?

By Gregory Lewis | February 19, 2018

Horsepox is a virus brought back from extinction by biotechnology. In 2016, the Canadian researcher Ryan Noyce, along with colleagues, synthesized the horsepox genome from the virus’s published DNA sequence, introduced it into a cell culture alongside another poxvirus, and in this way recovered “live” horsepox virus.

Horsepox is a close cousin of the vaccinia virus used as a smallpox vaccine, so resurrected horsepox could be used to make another, and potentially better, smallpox vaccine. Yet synthesizing horsepox presents a glaring dual-use concern: Smallpox itself is also closely related to horsepox. If horsepox can be synthesized “from scratch” with the published sequence, couldn’t someone do the same for smallpox?

The publication last month of Noyce’s horsepox work has proven controversial. Noyce and his colleagues have argued that demonstrating the feasibility of synthesizing horsepox will inform and advance the biosecurity conversation around smallpox. Others—such as Tom Inglesby, director of the Johns Hopkins Center for Health Security—have disagreed: Generating a risk to show it is indeed risky seems a dangerous path, and the potential benefits of a better smallpox vaccine may prove poor compensation for the increased possibility that a malicious actor could cause an artificial smallpox outbreak.

Beyond the immediate issue of whether the horsepox work should have been performed (or published), the horsepox synthesis story highlights a more general challenge facing dual-use research in biotechnology: the unilateralist’s curse.

Curses. Imagine that 100 scientists are individually deciding whether it would be a good idea to synthesize horsepox. All of them act impartially and in good faith: They would only conduct this work if they really thought it was on balance good for humankind. Each of them independently weighs up the risks and benefits of synthesizing horsepox, decides whether it is wise to do so, and acts accordingly.

The unilateralist’s curse—an idea proposed by Nick Bostrom and colleagues at Oxford—arises from the fact that, if synthesis of horsepox is not to occur, all 100 scientists must independently decide not to pursue it; while if any of the scientists judges the benefits to outweigh the risks, he or she acts unilaterally to synthesize horsepox. Thus there is an “action bias”: Horsepox synthesis is more likely to occur when scientists act independently than when they agree to a decision as a group.

The “curse” part of the unilateralist’s curse emerges from this asymmetry: If horsepox synthesis is good, a scientist mistakenly thinking it is bad will have little impact as long as one of the other scientists, (correctly) thinking it is good, synthesizes horsepox. Yet if horsepox synthesis is bad, just one scientist who mistakenly thinks it is good can produce a bad outcome, even though the scientist’s peers all (correctly) recognize it as a mistake.
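
To put illustrative numbers on this asymmetry: Suppose each scientist independently has a small probability p of misjudging a bad project as good. The chance that at least one of N scientists errs (and therefore acts) is 1 - (1 - p)^N. With an assumed error rate of 5 percent, a lone scientist goes wrong one time in 20, but among 100 scientists the chance that at least one goes wrong is 1 - 0.95^100, or more than 99 percent. (These figures are purely illustrative; they are not estimates drawn from the horsepox case.)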

The table below might make this situation clearer, with the bottom right corner representing the unilateralist’s curse.

| | Horsepox synthesis is good | Horsepox synthesis is bad |
| --- | --- | --- |
| Scientists do not synthesize horsepox | All scientists (incorrectly) see horsepox synthesis as bad | All scientists (correctly) see horsepox synthesis as bad |
| Scientists synthesize horsepox | At least one scientist (correctly) sees horsepox synthesis as good | At least one scientist (incorrectly) sees horsepox synthesis as good |

The challenge of the unilateralist’s curse is that decisions about whether to pursue potentially harmful research are left to the most optimistic outlier. In cases where the most optimistic view is that research should be performed, but the median view is that it should not, the research probably should not be performed. Yet due to the unilateralist’s curse, it will probably happen anyway.

Other factors can exacerbate the threat posed by the unilateralist’s curse:

  1. The number of potential actors who can act unilaterally: If 10,000 scientists are wondering whether to conduct a piece of concerning research, the chance that one of them will mistakenly think it is a good idea is greater than if there were only 100 scientists (or 10, or one). The sketch after this list puts rough numbers on this effect.
  2. Time: As time passes, the opportunity for a mistake to be made increases.
  3. The difficulty of assessing a threat (or individuals’ poor ability to assess a threat): If scientists are prone to errors in judgment when assessing benefits and risks—either because of the inherent difficulty of the assessment, or because of their own limited ability to assess accurately—the chances increase that one scientist will greatly underestimate a danger.
  4. Conflict of interest: No matter how wide the spectrum of reasonable disagreement over a given piece of concerning research might be, the spectrum of unreasonable disagreement is still wider. Scientists may be motivated to pursue potentially dangerous work by the prospect of fame or monetary reward, inducements that may color their judgment about what is best for the common good. (The authors of the horsepox paper are listed as co-inventors on the patent relating to their work, and so stand to gain monetarily from any commercial applications.)
  5. Tragedy of the commons: If scientists suspect that another scientist might pursue dangerous work for personal gain, the implicit agreement among them not to perform such work is fragile. Scientists may think, “Well, if someone is going to do this work anyway, I might as well be the one who gets the benefit.”
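
A minimal simulation, sketched below, applies the formula from above across factors 1 through 3. Everything in it (the 5 and 10 percent error rates, the function name, the assumption that each decision is independent) is hypothetical rather than estimated from the horsepox case.

```python
# Toy model of the unilateralist's curse (illustrative assumptions only).
# Each scientist is assumed to independently misjudge a dangerous project
# as beneficial with probability p_error in each decision period.

def p_unilateral_action(n_scientists, p_error, n_periods=1):
    """Chance that at least one scientist, at least once, decides to act."""
    return 1.0 - (1.0 - p_error) ** (n_scientists * n_periods)

if __name__ == "__main__":
    # Factor 1: more potential actors.
    for n in (1, 10, 100, 10_000):
        print(f"{n:>6} scientists, one period:   {p_unilateral_action(n, 0.05):.3f}")
    # Factor 2: more time (100 scientists re-deciding over ten periods).
    print(f"   100 scientists, ten periods:  {p_unilateral_action(100, 0.05, 10):.3f}")
    # Factor 3: a harder-to-assess threat, modeled as a higher error rate.
    print(f"   100 scientists, higher error: {p_unilateral_action(100, 0.10):.3f}")
```

Under this toy model, individually careful actors still produce near-certain unilateral action once the pool of actors, the time horizon, or the error rate grows large.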

It is hard to say for sure whether the unilateralist’s curse applies to the horsepox synthesis case—it remains unclear whether in fact horsepox synthesis is bad, and in any event scientists seldom report experiments they decide not to perform. A circumstantial argument can nonetheless be made that the unilateralist’s curse did come into play.

The possibility that synthetic biology could be used to produce dangerous pathogens (including smallpox) has been expressly discussed for more than a decade. In 2014, the World Health Assembly commissioned a report on the public health implications of the capacity to recreate smallpox via synthetic biology. Once horsepox was in fact synthesized, a controversy erupted, demonstrating a spread of opinion among reasonable people about whether such work should have been done. The scientists responsible for the research, in the discussion section of their paper, suggested that their views lay toward the more optimistic end of the spectrum of opinion:

Given that the sequence of variola [smallpox] virus has been known since 1993, our studies show that it is clearly accessible to current synthetic biology technology, with important implications for public health and biosecurity. Our hope is that this work will promote new and informed public health discussions relating to synthetic biology, stimulate new evaluation of [horsepox]-based vaccines, and advance the capacity to rapidly produce next-generation vaccines and poxvirus-based therapeutics.

All this suggests that synthesizing poxviruses has been possible for a while—and that the possibility was finally realized by scientists particularly optimistic about the balance of benefit versus risk.

Vulnerable governance. In scientific research, certain safety nets could “catch” well-intentioned but mistaken scientists before they perform, or publish, hazardous research. Yet these safety mechanisms are themselves vulnerable, including to the unilateralist’s curse.

Funding bodies provide some centralized oversight and control over what research is performed, and most major funders have governance mechanisms to review research of dual-use concern. But not all biological research must undergo this scrutiny: Indeed, the horsepox work was funded by Tonix, a private enterprise hoping to commercialize the result. The increasing role of private funding in science, along with the democratization of biological research (for example, citizen science initiatives), means that more and more research will bypass this safety net.

Regulators—whether national governments, international bodies, or institutional biosafety committees at particular academic centers—form another safety net. But these entities are disjointed and cannot be relied upon to appreciate all potential dangers. As described in a 2016 World Health Organization report, the horsepox scientists noted that regulators:

[M]ay not have fully appreciated the significance of, or potential need for, regulation or approval of any steps or services involved in the use of commercial companies performing commercial DNA synthesis, laboratory facilities, and the federal mail service to synthesize and replicate a virulent horse pathogen.

The unilateralist’s curse enhances the risk of regulatory mistakes: Even if one university (or one country) blocks a piece of concerning research, other countries may not follow suit. As Inglesby notes, “[W]ork considered very high risk and therefore rejected by one country may be approved by others.”

Journals can decline to publish work if it contains hazardous information, as has been discussed in the past—for example, regarding gain-of-function influenza experiments. Yet journals act independently of one another, and so are at risk of unilateralism. Consider this reply, made by an editor at Science, to a pre-submission inquiry from the authors of the horsepox synthesis paper:

While recognizing the technical achievement, ultimately we have decided that your paper would not offer … readers a sufficient gain of novel biological knowledge to offset the significant administrative burden the manuscript represents in terms of dual-use research of concern.

If responsible and highly resourced journals such as Science are eager to “bounce” dual-use research (and the headache of evaluating it) to someone else, that someone else may be a publication with less capacity to make accurate judgments, or simply one with more reckless editors—that is, a journal more likely to mistakenly publish hazardous information. Unless journals communicate concerns about submitted work to one another, scientists can “shop around” until they find a journal willing to publish their work. (And ultimately, even if no journal publishes the work, self-publication and other ways of circulating scientific findings make publication review a poor last line of defense.)

What can be done? The unilateralist’s curse is greatly attenuated when individuals form a consensus view about whether a given piece of research is too dangerous to allow, and when they follow this consensus even if they disagree with it. How can such an approach to scientific work be encouraged?

The risks of unilateralism would be blunted if: Scientists exhibited less willingness to strike out on their own when the “downside risk” of a research project was not mere failure but a public health disaster; researchers established a stronger community norm of caution when contemplating research with health consequences for whole populations; and non-scientific stakeholders were more widely included in decision making.

Better governance can close the holes in the other safety nets. The export control regime known as the Australia Group incorporates a “no-undercut” principle, whereby countries are expected not to permit exports that another country has rejected without first consulting that country. Similar norms could be applied to funders, regulators (for example, institutional biosafety committees and institutional review boards), and journals. Such collaborations could also consolidate decision making among a smaller number of better-resourced groups. A truly global governance regime may be unattainable for now, but steps in the right direction aren’t hard to identify.
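
As a thought experiment, the sketch below shows what a “no-undercut” check might look like if journals or review bodies shared a common registry of dual-use rejections. No such registry exists; the class, the submission identifier, and “Journal A” are all invented for illustration.

```python
# Hypothetical sketch of a shared "no-undercut" registry for journals or
# review bodies. No such system exists; all names are invented. The rule
# mirrors the Australia Group norm: do not approve what a peer has
# rejected without first consulting that peer.

from collections import defaultdict


class NoUndercutRegistry:
    def __init__(self):
        # submission_id -> list of (rejecting body, stated reason)
        self._rejections = defaultdict(list)

    def record_rejection(self, submission_id, body, reason):
        self._rejections[submission_id].append((body, reason))

    def bodies_to_consult(self, submission_id):
        """Peers that previously rejected this work; consult before approving."""
        return list(self._rejections[submission_id])


registry = NoUndercutRegistry()
registry.record_rejection("example-submission", "Journal A", "dual-use concern")
for body, reason in registry.bodies_to_consult("example-submission"):
    print(f"Consult {body} before approving (prior rejection: {reason})")
```

The point of such a design is simply that approval becomes conditional on consulting prior rejectors, removing the incentive to “shop around” described above.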

Close to the precipice. One should be candid about the costs of restricting research: Scientific breakthroughs often occur when mavericks strike out against the consensus, and the future benefits of biotechnological progress will be considerable. Yet these benefits are increasingly fragile. The effort to eradicate smallpox cost approximately $1.7 billion, required the combined efforts of thousands of people, and has saved more than 40 million lives. Horsepox synthesis required three scientists and cost $100,000.

The eradication of smallpox is rightly seen as one of humanity’s great triumphs; the costs of a synthesized smallpox outbreak would be unconscionable. Yet we edge closer to this precipice, driven on by occasional and perhaps mistaken bursts of enthusiasm and curiosity. Smallpox is not the only dual-use risk posed by synthetic biology: Pathogens could be deliberately engineered as well as merely recreated, and further biotechnological advances may bring into view other risks, currently beyond the horizon. When so much is at stake, collective caution should be the guiding principle—not unilateral action. Not all discoveries justify courting global catastrophe.

Thanks to Joe Carlsmith, Tom Inglesby, Piers Millett, Cassidy Nelson, Carl Shulman, and Andrew Snyder-Beattie for their advice and criticism. Their kind help does not imply their agreement, and any mistakes remain my own.

