The authoritative guide to ensuring science and technology make life on Earth better, not worse.

How to confront emerging pathogens

Technological advances in the life sciences hold out the promise of controlling or eliminating stubborn diseases. They also increase the risk that malevolent actors will learn to produce new and highly dangerous pathogens, a prospect that deeply concerns security professionals in developed countries. In the developing world, meanwhile, where many nations struggle mightily with diseases such as AIDS and malaria, public health concerns tend to focus more on the here and now—or, when it comes to emerging threats, on how to contend with natural rather than manmade pathogens. Below, authors from Nigeria, South Africa, and Argentina address the following question: How can governments, institutions, and professionals in both the developed and developing worlds make the world safer from emerging pathogens, whether natural or manmade?

Round 1

Making bad data good

Outbreaks of emerging pathogens, whether natural or manmade, are not just health issues. They present challenges along several other dimensions—legal, political, economic, and military. Improved disease surveillance is at the heart of meeting many of these challenges.

Disease surveillance entails gathering information about the current and past prevalence of diseases in specific countries and regions; this information provides the context necessary for understanding new disease outbreaks and determining, among other things, whether they are natural or manmade. Such information is gathered, for the most part, by national health organizations, which forward it to entities such as the World Health Organization, the Pan American Health Organization, and the Centers for Disease Control and Prevention. There, information is compiled into databases and published. But it's not always good information.

I have worked closely with disease statistics for more than a decade. Sadly, I have found that information about disease prevalence, whether in the developed or developing worlds, often contains discrepancies. For example, Argentina, Brazil, and Bolivia were reporting cases of dengue fever some years ago at a time when Paraguay was not. From a geographical and disease-transmission perspective, this was impossible. The discrepancy was not resolved until Paraguay provided updated numbers.

Faulty disease statistics are quite troubling in a world where international travel is easy, controls on passenger health are loose, and urbanization has created very poor areas where crowding, inadequate hygiene, and low-nutrition diets allow diseases to flourish. Inaccurate statistics also represent a major impediment to national initiatives in both public health and defense. If you cannot say with confidence which diseases exist in your country, how can you develop a strategic plan to fight disease? How can you gauge the impact of diseases on your population? How can you prevent malevolent individuals from doing harm with deadly pathogens? Better data about the prevalence of disease are also essential for helping researchers perform their jobs more effectively and for fulfilling commitments to instruments such as the Biological and Toxin Weapons Convention. If more effective systems for disease surveillance are to be established, improvements are necessary in two main areas: the legal and institutional realm, and the educational realm.

An international legal and institutional framework already exists for emerging diseases, comprising a health component and a weapons component. On the health side, the World Health Organization oversees efforts to combat many diseases. On the weapons side, the Biological and Toxin Weapons Convention, the Geneva Protocol, and the Australia Group (an informal export-control entity) are intended to inhibit the development of biological weapons. But there is considerable overlap between the two realms—for example, yellow fever and dengue fever are relevant both to the health side and the weapons side. This leads to much duplicated effort, as individuals and institutions responsible for disease reporting often must report data to multiple entities. Thus the odds increase that data discrepancies will appear. These problems could be remedied if international health and weapons institutions came to regard disease surveillance as a natural point of contact between their respective endeavors.

But no matter how close the coordination might be between health-oriented and weapons-oriented organizations, data on disease outbreaks will remain flawed if—as is often the case—local health professionals lack the education and training they need to fulfill their duties in disease surveillance. Nations and individual health professionals sometimes lack the ability to identify diseases. Sometimes they don't clearly understand which diseases to report, or whether to report individual cases or only major outbreaks. They may be unsure whether to report all cases or only fatalities. And there may be confusion about whether reports should be filed yearly, monthly, or whenever a worrisome event occurs.

The good news is that these problems can be resolved through improved education for health professionals and officials. The bad news is that little is being done at the international or national levels to improve education. Entities such as the World Health Organization should develop educational programs to address these problems, monitor their performance, and coordinate their efforts with related national initiatives. But bottom-up efforts can also be useful. For example, the University of Bradford has developed excellent tools that can be used to train health care professionals on issues such as reporting data to the Biological and Toxin Weapons Convention. Creating a world safer from emerging pathogens requires that players at all levels—from municipal to national governments, and from regional organizations to global ones—do their part to improve systems for disease surveillance and reporting.

Combating pathogens through ethics education

Pathogens do not respect borders. Efforts to control outbreaks of disease—including those caused by emerging pathogens, whether natural or manmade—require that local and national responses be synchronized and that mechanisms for international cooperation be established. At both the national and international levels, scientists play an important role.

Scientists, because of their engagement with pathogens in daily research and their immersion in the literature that surrounds them, are well positioned to act as an early warning system for disease outbreaks. Their involvement in, and endorsement of, measures for biosafety, biosecurity, and dual-use control are vital to confronting the threats posed by emerging pathogens. ("Biosafety," in simple terms, refers to preventing unintentional exposure to pathogens, or their unintentional release. "Biosecurity" means protecting pathogens from theft, loss, or diversion. "Dual-use" research is legitimate research that could be misused to threaten public health or national security.)

In their role as a first line of defense, scientists often find themselves enthusiastically embraced by those who care about pathogen control. But serious questions surround the education that scientists receive in the ethics of biosafety, biosecurity, and dual-use research, and in associated standards for responsibility, professionalism, and good conduct. Ethics education often fails to prepare scientists to meet the expectations that are placed on them, and indeed numerous studies indicate that ethics education, on a global level, remains patchy and unstandardized. Until international ethics education becomes more comprehensive, it is difficult to see how scientists can be cast with real confidence as first-line defenders.

Ethics education for scientists must be focused so that scientists can see value in the topics under discussion—that is, ethics education requires buy-in from the scientists. Without buy-in, ethics education can aspire to little more than providing scientists with information about the risks associated with biosafety, biosecurity, and dual-use issues. It cannot encourage scientists to engage in critical reflection about the misuse of research and to practice ethical decision making.

Getting buy-in from scientists is not so difficult when it comes to biosafety. The biosafety risks associated with research, and scientists' responsibility to address these risks, are a fairly straightforward and generally manageable element of the ethics curriculum. But making scientists properly aware of security risks—whether these risks fall into the biosecurity or the dual-use category—is a far more complicated process. Scientists often perceive these risks as quite removed from their daily research practices (and in most cases there is little that scientists can actually do to reduce the risk that their work will be misused). If scientists are presented with responsibility for reducing risks that they perceive as unfounded, the efficacy of ethics education can be severely undermined. Thus ethics education must walk a fine line—encouraging ethical decision making and critical reflection without exaggerating the security risks inherent in research.

Low- and middle-income countries often suffer from a persistent lack of formalized ethics education. The majority of the ethics education that scientists receive in these nations is made available because funding requirements or collaboration agreements require that it be provided (or else it comes in the form of online courses). Scientists therefore receive ethics education that is highly generalized—or, if specific, is specific to the research context of a high-income country. These decontextualized ethics initiatives often discuss types of research that are outside the national research remit of low- or middle-income countries—and scientists often struggle to see the point in discussing risks that fall outside their frames of reference. Moreover, highly visible problems such as poor health care provision may overshadow security concerns in low- or middle-income countries, further complicating ethics education.

Another problem facing education initiatives in low- and middle-income countries is that these nations do not have, nor are they likely to develop soon, comprehensive structures allowing scientists to report concerns about biosafety, biosecurity, and dual-use issues. Such structures in wealthier nations may not be perfect, and they remain the subject of considerable discussion, but they do exist. So those involved in ethics education must be very careful not to present developing-world scientists with responsibilities that they have no way of fulfilling. Initiatives in ethics education must strike the right balance between responsibility and risk; otherwise, scientists will see biosecurity and dual-use issues as simply not pertinent to their research.

An important step toward confronting the threat of emerging pathogens is to develop improved educational approaches for life science ethics. As such approaches are developed, it must be continually reiterated that one size cannot fit all—any successful model for ethics education must contend with issues such as the cultural environment in which the model is to be applied and the practicalities of specific research environments. (Then again, the international harmonization of ethics education is an important issue in its own right.) Until ethics education can overcome such challenges, it will be very difficult to establish an international community of scientists—scientists appropriately aware of biosafety, biosecurity, and dual-use issues—who truly can act as a united first line of defense against emerging pathogens.

National responsibilities, cooperative possibilities

Governments and scientific communities in the developed world devote considerable attention and study to the emergence and re-emergence of pathogens. But in resource-constrained countries, this is often not the case. Africa is a region especially prone to outbreaks of the diseases naturally transmitted between vertebrate animals and humans (zoonotic diseases); these include Ebola, Rift Valley fever, and plague. All countries in the region are at risk from these diseases, and cross-border outbreaks are frequent. But African nations are often characterized by a failure or inability to effectively address the emergence of new diseases or the re-emergence of endemic ones.

Several reasons for this stand out. Africa's systems for disease surveillance are weak and laboratory support is poor, making it difficult to produce data needed for assessing disease burdens and responding with appropriate priorities. When good information is unavailable, the emergence of new pathogens is often met with denial—until a disease outbreak reaches epidemic proportions. Once an epidemic is under way, an affected country is invaded by international health agencies, but they operate in panic and crisis-response mode, and their efforts amount to too little, too late. Pathogen outbreaks ultimately become opportunities for foreign researchers and health agencies to fine-tune their skills, leaving scientists in resource-poor countries permanently dependent on outsiders—reduced to mere sample collectors, unable to control the next pathogen outbreak on their own.

How can these challenges be overcome? The first step involves focusing on local processes of pathogen emergence. Pathogens emerge under widely varying environmental, demographic, and socioeconomic circumstances. A pathogen's ability to emerge or re-emerge depends on factors including genetic changes or adaptation in the pathogen itself; environmental conditions associated with climate, economic development, and land use patterns; and issues related to pathogens' human hosts, including demographics, international trade patterns, misuse of antibiotics, people's occupational exposure, the neglect of public health services, and bioterrorism. These factors and conditions interact differently in different parts of the world—therefore, the first step toward preventing and controlling outbreaks of emerging and re-emerging diseases is to gain a thorough understanding of local processes of pathogen emergence. Once such an understanding is gained, governments, institutions, and professionals—especially in the developing world—must commit themselves to clearly defined, proactive roles in the fight against emerging and re-emerging diseases.

At the national level, in particular, it is essential that each country take "ownership" of systems for disease surveillance, prevention, and control; this allows country-specific response measures to be formulated. Taking ownership of these systems entails making a genuine political commitment to them, and requires that adequate resources, financial and human, be provided for disease surveillance and for laboratory support systems. It is crucial that nations maintain systems capable of detecting, identifying, and containing pathogens that have epidemic potential before they spread too widely.

For governments, taking a proactive role in combating disease also entails implementing appropriate emergency response plans; coordinating collaborative interactions between human and veterinary health surveillance systems; building and sustaining the disease-fighting capacity of local health personnel by providing them training, opportunities to update their skills, and an empowering work environment; and establishing a multidisciplinary approach to disease control, one that allows individuals from diverse fields to bring their expertise to bear on the control of emerging or re-emerging diseases. (Engagement from the private sector, for example, ought to be forthcoming because disease outbreaks threaten everyone's economic security.)

Making the world safer from emerging and re-emerging pathogens also requires a great deal of global collaboration. Nations should collaboratively implement policies, for example, that control pathogens' ability to spread via modern transportation systems. They should participate in regional surveillance and response activities. They should share real-time surveillance information in order to detect zoonotic diseases in animal populations before they appear in human populations. Meanwhile, researchers all over the world should collaborate to strengthen surveillance of human populations that are at high risk of contracting zoonotic diseases. Science-based nongovernmental organizations, because of their wide geographic reach and their field expertise, should be engaged as partners to help provide comprehensive surveillance and response capabilities. Meanwhile, innovative mechanisms could be established that would ensure adequate funding for sustainable global disease surveillance systems. The struggle against emerging and re-emerging disease is a complex and difficult challenge that requires full-scale effort at both the national and international levels.

Round 2

Planning for the entire disease cycle

So far in this roundtable, the authors have presented a number of valid and largely correct arguments. But the conversation has been atomized, with each author focusing on issues that fall within his or her own experience or area of expertise. This atomization is symptomatic of problems within the global health system, where inadequacies in communication and coordination can plague efforts to control emerging pathogens. Often, stakeholders in health systems don't know who the other key players are, what those individuals' responsibilities are, and how to go about working together.

To remedy this, nations must establish systemic approaches to battling pathogens. One way to make efforts systemic is to organize them around the stages of a disease's evolution—that is, to make plans corresponding to the times before, during, and after an outbreak.

Before an outbreak, the most important tasks are evaluating readiness and predicting the needs that will emerge once an outbreak occurs. Accordingly, officials must gather good statistics regarding past and present epidemiological conditions. Needs for equipment and personnel must be identified, and those responsible for resource allocation must make appropriate decisions. Scientists, medical doctors, nurses, and even politicians must receive the education and training that they will need in the event of an outbreak; all stakeholders must be included. Response plans must be formulated in line with plausible outbreak scenarios. Finally, those responsible for treating disease (doctors, nurses, and administrative personnel) and those who study disease (scientists in research laboratories) must prepare to work collaboratively, and they must be provided incentives as necessary. Changing health care's need-to-know culture into a need-to-share culture is fundamental.

Outbreaks put the steps taken during the preparatory stage to the test. During an outbreak, officials once again must gather good statistics. They must decide where to allocate emergency resources, including money, equipment, instruments, and professionals. Political decision makers must provide support for disease-fighting efforts. The mass media must do its part to provide necessary information to the public.

The aftermath of an outbreak amounts to a learning opportunity, a chance to judge the adequacy of plans made before the outbreak and of implementation during it. Evaluations made at this time—not only of health care itself, but also of communications efforts and the political aspects of disease response—should be fed back into planning for the next outbreak. Thus a new cycle begins.

All this sounds fairly straightforward, but pathogens' refusal to stay within one country complicates everything. Regional and international cooperation is imperative—but if it's difficult to build a successful response program within a single nation, doing so across several nations presents enormous difficulties. This is where the World Health Organization, with its ability to coordinate national efforts, plays a very important role.

In any event, it is critical that the goals of disease-fighting efforts be defined in objective terms; that an appropriate level of commitment be given to meeting those goals; and that individuals responsible for following through on plans be held accountable if they fail to fulfill their duties. In the struggle against emerging pathogens, there is a lot to lose and a lot to gain. Winning the battle depends on making good plans, implementing them well, assessing successes and failures, and incorporating what is learned into plans for the future.

Why don’t problems get fixed?

In Round One, the authors participating in this roundtable identified several problems that must be addressed if developing countries are to detect and respond more effectively to emerging and re-emerging diseases—or to prevent and detect malevolent or accidental releases of pathogens. The ideas presented in the three essays are important. But they are not new to discussions on disease control.

Oyewale Tomori correctly identified several weaknesses in nations’ health capacities, including inadequate systems of disease surveillance and poor laboratory support. He proposed that nations "take ‘ownership’ of systems for disease surveillance, prevention, and control," and also urged governments to share information and resources. Maria José Espona focused on the integrity of the data that surveillance systems provide and argued that health professionals, particularly in developing countries, often exhibit low capacity to produce accurate data. She believes that education represents a big part of the solution to emerging pathogens. We, the authors, believe likewise, and we argued in Round One that improved ethics education for scientists is crucial.

But for years, the problems identified in Round One have received attention in international health discussions, particularly in the context of the Biological and Toxin Weapons Convention (BTWC) and the World Health Organization (WHO). The concerns raised by Tomori and Espona are addressed to a large extent in the 2005 International Health Regulations, a legally binding international agreement negotiated under the auspices of the WHO, which specifically requires states to "develop minimum core public health capacities." In support of this requirement, the WHO’s Department of Global Capacities Alert and Response works to enhance national and international capacities in disease surveillance and response. As for ethics education and awareness-raising for scientists, these issues have been identified in the context of the BTWC, over at least the last eight years, as critical in preventing malevolent uses of the life sciences.

Shortcomings in national capacity, disease surveillance, and ethics education are well understood. So what stops them from being addressed successfully? The likely answer is that these shortcomings are symptoms of larger problems in the global health system. One such problem is weak or corrupt governance, as addressed by Tomori in Round Two. Another is the relatively low priority given to public health at the regional, national, and international levels.

But in addition, while the BTWC has the potential to provide a framework for international cooperation and exchanges in health, its effectiveness is undermined by its current inertia—which can be traced to the 2001 collapse of negotiations to establish a verification mechanism for the treaty. The BTWC, like other disarmament treaties, is also vulnerable to larger international political tensions. Today, decision-making processes in the treaty context are all but nonexistent.

In 2011, ahead of the Seventh Review Conference for the treaty, co-author Gould, along with Jeremy Littlewood and Gigi Kwik Gronvall, argued that one way to cure the BTWC’s malaise might be to more vigorously implement the treaty’s Article X, which encourages states to share knowledge and technologies. Such sharing could, among other things, improve disease detection and response. Gould, Littlewood, and Gronvall recommended—and this remains as relevant today as it was in 2011—that knowledge and technology exchanges between developed and developing nations no longer be cast as interactions between donors and hapless recipients.

Indeed, when it comes to disease surveillance and response, developing countries can give as much as they gain in exchanges with wealthier nations. But as things stand now, developing countries don’t view the BTWC as a useful forum through which to raise concerns and share knowledge. This produces something of a chicken-or-egg problem, because if the treaty is to become truly relevant to scientists and publics in emerging countries—and more effective on a global level—developing nations need to put their own stamp on proceedings. Discussions at the BTWC that do not take into account the concerns of developing countries produce linguistic and conceptual gaps that ultimately induce apathy. This problem must be overcome if all parties to the treaty are to participate in it effectively.

Unfortunately, only a tiny handful of civil society groupings around the world follow the proceedings of the BTWC. A similarly insignificant number of national activist organizations campaign for improved public health services. So while the treaty could offer a powerful tool for fostering international discourse on disease control, something is lacking—the political pressure and sense of purpose needed to effect changes that could counter the spread of disease.

Health in Africa: Corruption and misplaced priorities

In April 2001, member states of the African Union met in the Nigerian capital of Abuja and pledged that, by 2015, each nation would devote at least 15 percent of its governmental expenditures to public health. Prospects for meeting that goal seem poor. As of 2009, the proportion of government expenditures devoted to health had actually declined in 11 African nations. As of 2011, though the proportion of government expenditures devoted to health had increased across the continent (to 11 percent from 9 percent), only six countries had reached the 15-percent goal.

Many African governments blame inadequate public health funding on poverty. But the real culprits are corruption and misplaced priorities—which guarantee that delivery of health care is poor, surveillance systems to detect emerging and re-emerging pathogens are ineffective, and efforts to control disease often end in failure. The AIDS and Rights Alliance for Southern Africa, a regional network of nongovernmental organizations, runs a campaign that draws attention to the spending choices that African governments make. The alliance reports that some governments, instead of providing adequate funds for health, education, and other services that would better the lives of their people, devote exorbitant sums to frivolous expenditures. The government of Swaziland has spent $500,000 on a luxury car for the king. Uganda has spent $48 million on a private presidential jet. Zimbabwe spent $250,000 on a lavish celebration of the president's 85th birthday. Senegal has spent $27 million on a bronze statue taller than the Statue of Liberty. And in Nigeria, a proposed new city gate for Abuja would cost $395 million. None of these nations has met its 2001 commitment regarding health care spending.

In most African countries, implementation of the disease surveillance activities that are required for early detection of emerging pathogens remains defective at both the local and national levels. For example, a recent assessment of disease surveillance and response implementation in Nigeria's Kaduna State revealed that 38 percent of the state's health facilities had no standard case definition for priority diseases, 71 percent lacked a computer and printer, and 81 percent carried out no analysis of data they collected. In Africa, poor surveillance and data management mean that months can often pass between the beginning of an outbreak and the time it is first reported to health authorities. Even then, underreporting is likely to be rampant—epidemiological investigations often reveal many more cases than were reported through surveillance systems. The African Union has estimated that corruption costs African economies about $150 billion each year. A fraction of that money could provide every nation in Africa with an efficient disease surveillance system and a high-quality laboratory network to support it.

In Round One, Louise Bezuidenhout and Chandre Gould discussed ethics education as a critical element in allowing scientists to act as a first line of defense against emerging pathogens—but where corruption flourishes to such an extent, there is little chance for ethics to survive or thrive. Developing countries must get their priorities in order and be held accountable for their health expenditures. If each country did what it is really capable of—in enhancing disease surveillance, improving laboratory support, and efficiently managing data—the world would be much safer from emerging and re-emerging pathogens.

Round 3

Spotlight on South America

The fight against emerging pathogens can appear in very different lights depending on who you are and where you live. In this roundtable, Africa’s efforts to control emerging pathogens have received a lot of attention, while my own region of South America has received less. So I’d like to devote my last essay to illuminating the South American picture.

From the perspective of emerging pathogens, a major difference between South America and Africa is that, when Spain and Portugal began conquering South America in the sixteenth century, they merged with local populations to a very large extent. They traveled not just to the coasts but to most areas of the continent. They imposed their religion and their legal systems. They established new population centers. This meant that they also introduced new diseases—which devastated indigenous populations. But because the Europeans intermarried with local people, immunity to many pathogens was also introduced.

In South America today, because of migration that is motivated by the desire to work or study, or by family considerations, populations continue to mix with one another at a high rate. As South Americans travel from place to place, they often carry no records of their disease history—but they do bring pathogens with them, or interact with new disease vectors along the way. Once they are established in their new locations, they tend to exhibit behaviors—regarding their own health care, for example, or food preparation—that they brought from their previous homes. Pathogens can thrive under such conditions, and diseases such as dengue fever and yellow fever are endemic in a range of South American countries. Meanwhile, the mobility of South American populations makes it very complicated to carry out disease surveillance, determine a disease’s epidemiological status, or harmonize policy among nations.

Nonetheless, South America’s health system has a lot going for it. Its advantages include well-organized ministries of health; a cadre of well-trained doctors and scientists; and good basic infrastructure, such as high-quality laboratories and hospitals devoted to infectious diseases. (Unfortunately, these resources tend to be concentrated in big cities; small towns and rural areas are underserved in comparison.) The region also benefits from organizations such as Unasur, which provide forums for discussing health challenges and formulating responses, and from a shared history and set of shared values—factors that make cooperation across borders relatively easy.

Even so, South America could do a much better job of containing emerging pathogens. Because the region continues to struggle against urgent problems such as poverty and social inequality, public health receives inadequate governmental attention and resources. In the long run, what South America needs is a common, systemic approach to the fight against emerging pathogens. Some elements of the continent’s health system work well, but the various pieces of the puzzle don’t fit together optimally.

Each of the world’s regions is unique, and each of its health systems is unique as well. Today, with forces such as urbanization and increased international travel presenting new challenges in the fight against emerging pathogens, it is crucial to understand the past and the present of every region. The future of health care depends on it.

The journey, not just the destination

In Round Two, Maria José Espona identified an "atomization" in this roundtable. The various authors, she wrote, were mostly emphasizing issues relevant to their own expertise. And she argued that the roundtable's "atomization" was symptomatic of the global health system's inadequacies in communication and coordination. In Round Three, Oyewale Tomori expressed deep frustration over African states' failure, as he sees it, to take responsibility for effective disease surveillance and response. He characterized African nations' reliance on international support as evidence of a "dependency stupor."

Espona and Tomori, in their different ways, highlighted the global health system's critical need to become more integrated, more responsible, and more responsive. But the vision that they effectively share represents a destination. Before the destination can be reached, it is necessary to recognize and gain a better understanding of a multiplicity of factors that impede progress (in addition to those already identified by Tomori and Espona). Several of these factors involve either inadequate information—regarding what works, what doesn't, and why—or a failure by health care and science practitioners to advocate forcefully for change.

In many cases, important issues surrounding the establishment and administration of health systems in developing countries are poorly understood. The broad forces that underlie these systems—historical, social, economic, and ethical—are rarely documented or investigated in an extensive fashion by in-country researchers. As a result, though it is easy to identify problems in a given country's health system, it can be hard to know where to lay the blame for problems, or how to remedy them. Also, poor understanding of local circumstances (and the factors that underlie them) often causes "donor" countries to offer or impose solutions that do not suit local contexts, or don't correspond precisely to local problems.

Conversely, a lack of in-depth data about a particular country's health system can also cause what does work—and why—to be overlooked. Not all developing countries share the same risks and capacity constraints or exhibit the same systemic failures. For example, the health systems in South Africa and Kenya differ considerably from those in the Democratic Republic of Congo and Guinea. Disease surveillance systems in the former nations have their shortcomings, but these countries’ capacities to detect and respond to disease outbreaks are far more robust than in countries that, for example, have experienced prolonged conflict. Thus, a lack of discussion about why some initiatives work makes it difficult to adapt successful strategies from one context to another and diminishes the potential to learn from successes.

An important dimension to these information gaps is that developing countries' health practitioners are often relatively silent and invisible in international policy forums. Because these practitioners are responsible for providing health services on a daily basis, they are ideally positioned to offer insights and propose remedies. When they don't contribute insights and remedies, or their contributions are not heard, outsiders often impose solutions that fail to address underlying problems, or that don't receive buy-in from policy makers and health care providers. Thus it is imperative that developing-country scientists, at conferences and in the context of collaborations, assert their needs and explain their realities—that they advocate for change more actively. A related stumbling block to progress is that many in the scientific community are reluctant to be vocal about—and active in—addressing weak governance and other systemic problems that hinder disease control efforts. It is crucial to determine why this reluctance to force positive change exists.

Similarly, national and international professional associations that are in a position to advocate for change could use their positions to greater effect if they pressured governments and donor organizations to heed the recommendations of scientists and health professionals. If they exerted such pressure, key issues might find their way onto national or international agendas. Such issues might include the difficulty of implementing improvements in biorisk management when, for example, core funding for laboratories isn't available (as is the case in many countries). Discussion of these issues is necessary if a clearer understanding of mismatches between health policy and health practice is to be achieved.

Ultimately, making the world safer from emerging pathogens requires a multi-faceted approach that includes improved bioethics education (as we, the authors, discussed in Round One) and greater collaboration among developing countries (as we discussed in Round Two). It also requires health practitioners at the national, regional, and international levels to take more vigorous action to effect systemic change.

Wake up. Ease off.

Global efforts to contain emerging pathogens will continue to fail, and pathogens will continue to have a field day, as long as some countries shirk their responsibilities and others annex responsibilities that aren’t rightfully theirs. Clear and distinct roles must be defined for both developed and developing countries and, crucially, each country must be held accountable for lapses in performing its defined role.

In Round Two, Louise Bezuidenhout and Chandre Gould discussed the concerns about national capacities in disease surveillance and response that Maria José Espona and I, separately, had earlier raised. Bezuidenhout and Gould noted that concerns about capacity are "addressed to a large extent in the 2005 International Health Regulations." These regulations, as Bezuidenhout and Gould wrote, require states to "develop minimum core public health capacities." But in 2011, six years after the regulations were agreed upon, the World Health Organization reported that only 32 percent of African countries had established legislation in support of the regulations. Only 33 percent had established the capacities in human resources necessary for implementing the regulations.

Meanwhile, Africa’s perennial weakness in surveillance and reporting is demonstrated by an ongoing Ebola outbreak in Guinea. The first case in the outbreak appeared no later than December 2013, but Guinea did not notify the World Health Organization until March 2014. This represents no improvement over the Ebola outbreak that struck Kikwit, in the Democratic Republic of Congo (then Zaire), 19 years ago. At that time, the interval between the outbreak and the notification of health authorities was also about three months. When it comes to rapid response to pathogens, Africa has remained in the slow lane.

What’s responsible for this inability to rapidly contain emerging pathogens? In addition to corruption, and the failure to prioritize health issues—themes I addressed in Round Two—many developing countries seem to have settled into a condition of dependency regarding health and emerging pathogens. Developed countries, meanwhile, have a tendency to cling to sovereign control of disease surveillance processes. One example should illuminate my point.

During the 1995 outbreak of Ebola in Kikwit, an international study team led by the World Health Organization was on hand to help contain the epidemic. At the closing ceremony, a discussion took place about how to disburse the funds that countries around the world (a few of them African) had donated to control the epidemic. I suggested that some of the money be used to upgrade a laboratory in Kinshasa that the French government had begun to construct but subsequently abandoned. Upgrading the laboratory would have enhanced African scientists’ capacity to provide support for disease surveillance and would have enabled them to handle dangerous pathogens on their own. Without improved laboratory support, as I explained, any African country undergoing a future Ebola epidemic would have to call for external help, which would work to the detriment of local scientists. No one present, whether from a developing or developed country, took my suggestion or my prediction seriously. Since then, Africa has experienced over a dozen outbreaks of Ebola; the affected countries have included Gabon, Sudan, Uganda, and now Guinea. Africa has been unable to contain any of these outbreaks without resort to international aid. In 1995, African countries missed an opportunity to "own" the processes of disease surveillance. Developed countries missed a chance to surrender their control.

It isn’t that Africa has made no progress. Rather, progress has been too slow and too fragile. When the continent has registered successes, they’ve usually been built on a foundation of assistance from international agencies—assistance that comes with time limits.

African and other developing countries must wake up from their dependency stupor regarding health, including disease surveillance and prevention. African countries must make more vigorous efforts to develop the core capacities required under the International Health Regulations, and countries that fail to meet their agreed targets should face consequences. Unfortunately, the regulations themselves don’t provide for sanctions or other accountability measures; travel restrictions, for example, could nonetheless be imposed on individuals from countries where certain diseases are endemic. Above all, each African country must commit its own resources to ensuring appropriate surveillance for emerging and re-emerging pathogens. Meanwhile, developed countries should ease off their dominance over the processes of global disease surveillance. Only then can developing countries truly "own" these processes.
