
A Framework for Tomorrow’s Pathogen Research

Final Report

Chairs

Ravindra Gupta

Ameenah Gurib-Fakim

Shahid Jameel

David Relman

Directors

Jesse Bloom

Filippa Lentzos

February 2024

Trust-building

Across academia, industry, and governments, many scientists work in service of publics. An important part of that work, often unrecognized, is enabling oversight and advisory bodies that deliberate over the risks and potential benefits of new research and technology. The role of scientists, including the nature of their research, how it is regulated, and how that research has or has not benefited the common good, is not well communicated. In the prevailing climate of misinformation and disinformation, it is more important than ever for members of the scientific community to think deeply about whom they need to engage and how to earn trust (see Box 2). Many scientists see themselves as well-intentioned purveyors and defenders of scientific truths. However, without thoughtful communication and trust-building, scientists often further alienate those who do not trust science and its practitioners. This can in turn fuel extremist narratives or conspiracy theories rather than build bridges and encourage more moderate and informed viewpoints on polarizing scientific issues.

Research that risks the emergence of novel pathogens, as well as efforts to predict and quantify that risk, can be controversial even among scientists. The responsibility to educate stakeholders (including publics) about policies and practices for safe and secure research, and to improve those policies and practices, can seem a time-consuming distraction from scientific research. It can also expose researchers to harassment. Adequate resources to support scientists in anticipating future problems and deciding how, when, and what information to share with publics are often not available (Mejlgaard et al. 2020). In the worst case, scientific controversies can lead to long-lasting mistrust in scientists and associated institutions, as well as governments.

The first step for scientific organizations and institutions to earn the trust of publics and other stakeholders will be to ensure that pathogen research is safe, secure, and responsible (as delineated in previous sections of this report and its recommendations).

Aspiring to trustworthiness

Responsible science entails an obligation towards public engagement (WHO 2022), and research with pandemic risks requires extra attentiveness to communication. This is important because lack of trust in science and scientists can have grave consequences. A 2023 Pew Research Center survey found a significant loss of public confidence in scientists among Americans in general, with only about 11 percent of Republicans and 37 percent of Democrats expressing a great deal of confidence in scientists to act in the best interests of publics (Kennedy and Tyson 2023). A Yale University study of excess deaths in Florida and Ohio found that excess deaths in April-December 2021 were 43 percent higher among Republican voters than among Democratic voters (Wallace, Goldsmith-Pinkham, and Schwartz 2023). A Pew Research Center study found a similar trend: counties that voted Republican in the 2020 presidential election reported less trust in medical science and substantially more pandemic deaths than counties that voted Democratic (Hope-Hailey 2014; Jones 2022). One interpretation of these studies is that low confidence in science and scientists harmed not only those who do not trust science but also their communities.

A lack of scientific literacy is often cited as a reason why many do not trust scientists. For instance, a 2021 survey of more than 2,000 adults in the United Kingdom found that those with extremely negative attitudes towards genetic technologies tended to have low textbook knowledge but high confidence in their own understanding (Fonseca et al. 2023). This highlights the need for science communication to address the gap between what people objectively know and what they believe they know. Conversely, many people do trust other scientific technologies without understanding them. For example, people receive medical treatments without understanding how they work (presumably because they trust the intent of medical doctors). This suggests that scientific literacy, or a knowledge deficit, may not be the primary determinant of whether an individual trusts scientists to act in the best interests of publics.

What has been observed more recently is a lack of trust in the process, motivations, and politics surrounding emerging areas of science. This is an important reason for scientists (from diverse backgrounds) to demonstrate that they are honest purveyors of knowledge who care about people’s perspectives and concerns. Moreover, it is vital that scientists speaking to publics are transparent about reasonably perceived influences and conflicts of interest.

There are generally agreed-upon characteristics of trustworthy leaders that are important when communicating complex science to publics: (1) competency, including knowledge, skill and ability; (2) virtues, including wisdom, justice, compassion, courage, integrity, honesty, empathy, and selflessness; (3) consistency, i.e., reliability and predictability in approach; and (4) engagement, including being respectful of others and their knowledge and perspectives in a non-conceited and non-elitist manner, and being connected with the community impacted by their research by directly and clearly communicating challenges, motivations, and solutions (Mayer, Davis, and Schoorman 1995).

Scientists must embrace the above values to facilitate safe, secure, and responsible research leading to technologies that promote the common good. Ethical considerations must be incorporated into research design, along with a commitment to minimize potential risks to health, safety, and security. Key elements of managing risk and enhancing trust include developing biorisk prevention and management systems and practices; defining and identifying high-risk research; and ensuring appropriate scrutiny and oversight to effectively mitigate potential harms, as well as transparency with regard to associated risks. All of this obliges the research community to institutionalize effective and trustworthy communication with policymakers and journalists.

In highly competitive research fields, including pathogen discovery and manipulation, there is no incentive to share data or research plans prior to publication, since doing so could cost researchers their competitive edge. This challenge is heightened when there is unequal capacity, funding, or resource allocation and distribution of benefits (such as publications and recognition, or profits from products developed through the collaboration). One specific problem is data sharing. Scientists who collect novel pathogens may be disadvantaged, and lose their head start, if they share these discoveries with collaborators or other scientists who have more resources and greater ability to publish in prestigious journals. Under these circumstances, journals, databases, and funding agencies have powerful roles and obligations to enforce timely data sharing, research integrity, and equitable outcomes.

Attentiveness to trust-building will require engaging experts in science communication and policymaking to reshape how scientists carrying out research with pandemic risks should interact with different publics, groups, journalists, other scientists, and policymakers.

BOX 2: Additional reading for building trust

World Health Organization (WHO). 2022. Global guidance framework for the responsible use of the life sciences: Mitigating biorisks and governing dual-use research. World Health Organization. LINK.

Cochrane Convenes. 2022. Preparing for and responding to global health emergencies. Cochrane Convenes. LINK and Cochrane Convenes. 2023. How to communicate scientific uncertainty: A Lifeology and Cochrane collaboration. Cochrane Convenes. LINK.

The National Academies of Sciences, Engineering, and Medicine (NASEM). 2015. Trust and Confidence at the Interfaces of the Life Sciences and Society: Does the Public Trust Science? A Workshop Summary. LINK.

Pamuk Z. 2021. Politics and Expertise: How to Use Science in a Democratic Society. Princeton, NJ: Princeton University Press. LINK.
