The case for preventive measures

By Iris Hunger, December 26, 2007

My fellow discussants repeatedly offer the same reason for keeping sequence data public: Not doing so would harm the advancement of science and would prevent the development of top-quality medicine, including for biodefense purposes. This implies that they
would limit the availability of this information, were it not for this “do no harm to science”
rule. I would argue, instead, that limits are inappropriate because, quite simply, they do not
work. They will not prevent access: A dedicated terrorist will find a way to get the information he
or she wants. But even if limits prevented access, they would not remove the threat. If we remove
sequence data for traditional biological weapons agents such as smallpox from the public arena, who
says that an enemy will not use regular influenza as a weapon? Plus, limits are not manageable
globally. Who is to decide what information has to be limited? Who is and who is not allowed to
have access to restricted information?

If scientists are serious about not wanting certain data available publicly, they need to think
before they act. If it is likely that they will produce knowledge that they do not want to share
with others, they simply should not produce it. This is the concept of preventive arms control. Scientists have to ask themselves whether certain paths of research are worth taking. In other words, they need to make a risk-benefit assessment before starting a project. Once a research project has been conducted, its results should be published in full.

On a second point: the price of expertise. Both Gigi Gronvall and Jens Kuhn say that expertise
can be bought, and therefore it is not a serious limiting factor in bioweapons development.
History, however, teaches otherwise. The Japanese cult Aum Shinrikyo had trained scientists and a lot of money, yet it did not succeed in creating a bioweapon. Al Qaeda certainly has a lot of money. In the past, it was able to convince a PhD-level Pakistani microbiologist to provide it with information, but tacit knowledge was not for sale. Other experts one might expect to be buyable, such as bioweapons experts from the former Soviet Union, have likewise not been for hire. Most of them work in Western research projects, mostly civilian ones, not in Iran, Syria, or North Korea.

Lastly, both Leonid Ryabikhin and Jens Kuhn call on us to “continue to explore possible ways to control bioinformation” and to “broaden our thinking.” Well, do it! Please! I would like to hear new ideas, though coming up with them does not seem to be easy. There is a lack of innovative thinking about how to control the use of the diversified, globalized, and dual-use technologies, equipment, and knowledge in the life sciences.

In my view, a key principle has to be that we do not try to prevent certain activities from
happening, but that we simply watch what people do. If someone or some country is engaged in a
suspicious activity that might indicate bioweapon development (and this will be rare), governments and civil society should ask questions about intentions, whether the suspicious activities are taking place in the United States, Germany, Iran, or Malawi, and whether they are taking place in government, industry, or military facilities.

Scientists and governments need to be able to enter into a proper dialogue about these types of
activities. Knowing that a gene synthesizer is located somewhere tells you about capabilities, but
only through dialogue will we be able to determine its intended use.
