The layered, Swiss cheese model for mitigating online misinformation

By Leticia Bode, Emily Vraga | May 13, 2021

Protesters march against public health restrictions during the COVID-19 pandemic, including one woman carrying an anti-vaccine placard. Credit: Becker1999. CC BY 2.0.

COVID-19 has threatened the world with the worst pandemic in a century (Steenhuysen 2021), resulting in more than 100 million cases and more than 2 million deaths worldwide (WHO 2021). And despite the spectacular scientific achievement of developing multiple safe and effective vaccines in record time (Petri 2020), the world is not out of the woods yet. Accompanying the pandemic itself is what the World Health Organization has dubbed an “infodemic”—an overwhelming surplus of both accurate information and misinformation (WHO 2020).

In general, most information circulating online is accurate. One study, for example, found that only about 1 percent of the links about COVID-19 that a sample of voters shared on Twitter were to “fake news” sites (Lazer et al. 2021). However, several common myths have persisted during the so-called “infodemic”—for example, the conspiracy theory that researchers created the virus in a Wuhan laboratory (Fichera 2020), or that 5G cell phone towers are responsible for its spread (Brown 2020). Newer myths relate to vaccine development, like the one about COVID-19 vaccines being made from fetal tissue (Reuters 2020).

For a variety of methodological reasons, it’s hard to say how many people believe the misinformation they see online and form what social scientists call misperceptions. People report varying levels of belief in certain prominent myths. About 22 percent of people, as of last August, thought that the virus was created in a Chinese lab and 7 percent of people thought the flu vaccine could increase the chance of getting COVID-19 (Baum et al. 2021). Misperceptions matter: Belief in conspiracy theories related to the virus is associated with people being less willing to get vaccinated (Baum et al. 2021). And from mask wearing to vaccination, COVID-19 public health measures are about protecting not just the individual, but also others in society; one person’s decision to forgo a vaccine is a risk to everyone else. There’s no silver bullet to countering the online misinformation that can lead to these sorts of consequential misperceptions, but the good news is that interventions like correcting false information can work.

Information overload.

Social media platforms are now a major way people get news (Pew 2019b), but it is often the most sensational or emotional content that people engage with most (Marwick 2018). Misinformation is frequently just that sort of novel, emotional content tailor-made for virality, and research has shown it can spread faster than truthful content (Vosoughi, Roy, and Aral 2018). While platforms are making significant progress in moderating the content pushed out by billions of social media users, even highly accurate automated content evaluation will let misinformation circulate (Bode 2020). The problem, quite simply, is one of scale.
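A back-of-the-envelope calculation illustrates why scale defeats even very good filters. All of the numbers below—the daily post volume, the share of posts containing misinformation, and the filter’s accuracy—are illustrative assumptions, not figures from this article or any platform:

```python
# Illustrative sketch of the scale problem. Every number here is an assumption.
posts_per_day = 500_000_000   # hypothetical daily post volume on a large platform
misinfo_rate = 0.01           # assume 1% of posts contain misinformation
filter_accuracy = 0.999       # assume an automated filter catches 99.9% of it

misinfo_posts = posts_per_day * misinfo_rate
missed = misinfo_posts * (1 - filter_accuracy)

print(f"Assumed misinformation posts per day: {misinfo_posts:,.0f}")
print(f"Slipping past a 99.9%-accurate filter: {missed:,.0f}")
```

Under these made-up but not implausible magnitudes, a filter that is wrong only one time in a thousand still lets thousands of misleading posts through every day—which is why moderation alone cannot be the whole answer.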

An infodemic is problematic not just because of the presence of misinformation, but because of the abundance of information in general. People can have trouble sifting through this information and distinguishing good from bad. Compounding matters is that people are hugely and understandably interested in the pandemic—57 percent of people were paying close attention to COVID-19 as of September, and 43 percent reported finding it difficult to find information (Associated Press/NORC 2020). When demand for information outpaces the supply of reliable information, it creates a data deficit, where, as Tommy Shane and Pedro Noel write in First Draft, “results exist but they are misleading, confusing, false, or otherwise harmful” (Shane and Noel 2020).

The fact that COVID-19 is a new disease exacerbates these issues. Researchers and medical practitioners are still learning about the virus that causes it, and public health recommendations are constantly evolving. While this is an unavoidable characteristic of the scientific process, exposure to conflicting health messages can not only create confusion but also lead people to distrust health recommendations (Nagler 2014). Researchers can most clearly define misinformation when there is clear expert consensus and a large body of concrete evidence (Vraga and Bode 2020a). COVID-19, especially in its early days, lacked both of these features, making misinformation harder to identify and thus harder to address.

A model for mitigating misinformation.
Swiss cheese model for misinformation mitigation. Credit: Matt Field (based on work by Ian M. Mackay). CC BY 4.0.

How to confront an infodemic and correct misinformation.

Given the scale of these overlapping problems, no single solution to the COVID-19 misinformation problem will do. Instead, much like the “Swiss cheese model” of layering defenses against COVID-19 itself—vaccines, masks, social distance, ventilation, etc. (Mackay 2020)—multiple overlapping misinformation interventions can help. As a recent Scientific American article put it, “every layer in the model—blocking on platforms, fact-checking, online engagement, and creation of a science-friendly community—has limitations. Each additional layer of defense, however, slows the advance of deceptions” (Hall Jamieson 2021). The answer is not correction, inoculation, media literacy, content moderation, or deplatforming—it is all of these things.
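The arithmetic behind the Swiss cheese metaphor can be sketched in a few lines: if each layer stops some fraction of misinformation, the fraction slipping past all of them is the product of the individual failure rates. The effectiveness numbers below are invented for illustration, and the sketch assumes the layers act independently, which real interventions do not:

```python
# Illustrative Swiss-cheese arithmetic. The stopped-fractions are invented,
# and independence between layers is a simplifying assumption.
layers = {
    "platform content moderation": 0.50,  # assumed fraction of misinformation stopped
    "fact-checking and labels":    0.30,
    "user correction":             0.25,
    "media literacy":              0.20,
}

pass_through = 1.0
for name, stopped in layers.items():
    pass_through *= (1 - stopped)  # multiply the failure rates

print(f"Fraction slipping past all layers: {pass_through:.3f}")
```

The point of the sketch is qualitative, not the specific numbers: no single layer above stops even half of the misinformation, yet stacked together they let only about a fifth of it through—each added layer shrinks the holes that line up.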

Importantly, this also means that everyone has a role to play in mitigating the spread and impact of misinformation. To solve the misinformation problems of the COVID-19 infodemic, public health authorities need to promote clear and reliable information (Malecki, Keating, and Safdar 2021). Social media platforms can engage in content moderation (Myers West 2018) and content labeling (Clayton et al. 2020), as well as employing interventions to nudge (Pennycook et al. 2021) people away from misinformation and towards accurate information (Bode and Vraga 2015). Public officials need to depoliticize scientific issues and invest in education (Bolsen and Druckman 2015). And educators serve an important role in inoculating students against misinformation (Banas and Rains 2010) and engaging in media literacy efforts (Vraga, Tully, and Bode 2020).

While these are all important, the average social media user can play a part, too.

Across more than 10 studies, our research has consistently shown that even when anonymous social media users correct misinformation they can reduce the misperceptions of the (sometimes large) audience of social media onlookers who witness the interaction (Vraga and Bode 2020b).

Why might this be?

First, social media fosters weak social ties as opposed to strong ones (De Meo et al. 2014). Weak ties tend to represent a more diverse group than people with whom someone has strong ties; compared to people someone sees and interacts with every day in offline life, these weak ties may be more likely to bring novel perspectives or information to a discussion (Granovetter 1973). In the context of misinformation, this could mean diverse social media contacts are better able to recognize misinformation and have the information needed to correct it, whereas closer ties might not have been exposed to that corrective information or might not believe it.

Second, the threaded nature of social media means that audiences see corrections essentially simultaneously with the misinformation. Research shows that the shorter the time between misinformation and correction, the more effective the correction is (Walter and Tukachinsky 2020). Essentially, misinformation has less of a chance to take hold in someone’s mind if it is immediately corrected.

Third, even just observing corrections on social media may remind people about the potential social or reputational cost of sharing misinformation (Altay, Hacquin, and Mercier 2020). No one likes being wrong—which is one reason why many people resist corrective efforts. Being corrected can cause people to engage in motivated reasoning (Kunda 1990) to explain away the threatening piece of information. In the context of misinformation, this sometimes means that people will not accept correction of misinformation that aligns with their worldview.

But the people witnessing the correction of someone else are less emotionally involved than the person being corrected, and may be more amenable to accepting the correction. They see the reputational cost being imposed on someone sharing misinformation (Altay, Hacquin, and Mercier 2020), which can reinforce existing societal norms that value accuracy.


Research consistently shows positive effects of this sort of intervention (Vraga and Bode 2020b). Everyday social media users therefore have a clear role to play in mitigating the negative effects of misinformation.

What’s the most effective way to correct misinformation?

First, expertise and trust both matter for corrections. That expertise can be personal or organizational—a well-known health organization is going to be more effective in responding to misinformation about health given its perceived expertise (Vraga and Bode 2017). But that expertise can also be borrowed, such as when users provide links to these credible and trusted sources (Vraga and Bode 2018). And trust might matter even more than expertise when it comes to correction (Guillory and Geraci 2013), meaning that close friends and family on social media may be especially well-positioned to correct misinformation (Margolin, Hannak, and Weber 2018).

Second, repetition can be important as well. Misinformation is often sticky in part because of its familiarity—and familiar information feels more credible (this is called the “illusory truth effect”) (Fazio et al. 2015). Corrections need to be as memorable as the misinformation they are addressing, and repetition can make correction more memorable in the same way it does for misinformation. This is especially important when corrections come from social media users. Multiple users should correct misinformation when they see it to emphasize that public support is behind the facts, not the falsehood (Vraga and Bode 2017). As part of this repetition, people should emphasize the correction itself, not the misinformation—which they should reference only to demonstrate exactly why and where the misinformation is wrong (Ecker, Hogan, and Lewandowsky 2017). Simply telling people what is wrong does not work as well as also telling people what is correct. For example, stating that a COVID-19 vaccine will not give a person COVID-19 is less convincing than explaining that current vaccines do not contain any live virus and therefore cannot make you ill with COVID-19 (CDC 2021).

Finally, corrections do not need to be confrontational or cruel in order to be effective (Bode, Vraga, and Tully 2020). Offering empathy and understanding as part of a response to misinformation is equally effective in reducing misperceptions and might make the interaction more palatable for everyone involved (Hyland-Wood et al. 2021).

While many people have expressed concerns about correcting others on social media (Tandoc Jr., Lim, and Ling 2020), overall people tend to appreciate and even like the idea of correction of misinformation on social media. Recent surveys we’ve conducted suggest majorities of the US public hold favorable attitudes towards user correction on social media, including a belief that it is part of the public’s responsibility to respond (Bode and Vraga 2020). That isn’t to say people are oblivious to the possible downsides of correction, including the possibility of trolling or confusion. But recognizing that a large percentage of the public says that they have corrected misinformation on social media and that correction is valuable should reassure everyone that correction is not a social taboo. Indeed, the bigger harm to reputation is likely to come from sharing misinformation, rather than from correcting it (Altay, Hacquin, and Mercier 2020).

To sum up, people think it’s valuable to correct each other on social media, and it works when people do it. As a result, we hope that future interventions can focus on how to increase the number of people engaging in correction and ensure that those who do are well informed. Efforts like #iamhere, which mobilize and empower users to counter hate speech and misinformation on social media, are one model for this sort of approach (iamhere international 2021).

Together, layering peer correction with other approaches including content moderation, media literacy, content labeling, inoculation, and deplatforming can help stem the tide of misinformation.

References:

Altay, S., A. S. Hacquin, and H. Mercier. 2020. “Why do so few people share fake news? It hurts their reputation.” New Media & Society. https://journals.sagepub.com/doi/abs/10.1177/1461444820969893?journalCode=nmsa

Associated Press/NORC. 2020. “State of the Facts 2020: COVID-19.” Associated Press/NORC, October 20. https://apnorc.org/projects/state-of-the-facts-2020-covid-19/

Banas, J. A., and S.A. Rains. 2010. “A meta-analysis of research on inoculation theory.” Communication Monographs 77 (3). https://www.tandfonline.com/doi/abs/10.1080/03637751003758193

Baum, M., K. Ognyanova, H. Chwe, A. Quintana, R.H. Perlis, D. Lazer, J. Green et al. 2021. “The COVID States Project #14: Misinformation and vaccine acceptance.”  OSF. https://doi.org/10.31219/osf.io/w974j

Bode, L. 2020. “User Correction as a Tool in the Battle Against Social Media Misinformation.” Georgetown Law Technology Review 4: 367. https://georgetownlawtechreview.org/user-correction-as-a-tool-in-the-battle-against-social-media-misinformation/GLTR-07-2020/

Bode, L., and E.K. Vraga. 2015. “In related news, that was wrong: The correction of misinformation through related stories functionality in social media.” Journal of Communication 65 (4): 619-638. https://onlinelibrary.wiley.com/doi/abs/10.1111/jcom.12166

Bode, L., and E.K. Vraga. 2020. “Americans are fighting coronavirus misinformation on social media.” The Washington Post, May 7. https://www.washingtonpost.com/politics/2020/05/07/americans-are-fighting-coronavirus-misinformation-social-media/

Bode, L., E.K. Vraga, and M. Tully. 2020. “Do the right thing: Tone may not affect correction of misinformation on social media.” Harvard Kennedy School Misinformation Review June 11. https://misinforeview.hks.harvard.edu/article/do-the-right-thing-tone-may-not-affect-correction-of-misinformation-on-social-media/

Bolsen, T., and J.N. Druckman. 2015. “Counteracting the politicization of science.” Journal of Communication 65 (5): 745-769. https://onlinelibrary.wiley.com/doi/10.1111/jcom.12171

Brown, M. 2020. “Fact check: 5G technology is not linked to coronavirus.” USA Today April 23. https://www.usatoday.com/story/news/factcheck/2020/04/23/fact-check-5-g-technology-not-linked-coronavirus/3006152001/

(CDC) US Centers for Disease Control and Prevention. 2021. “Myths and Facts about COVID-19 Vaccines.” US Centers for Disease Control and Prevention March 11. https://www.cdc.gov/coronavirus/2019-ncov/vaccines/facts.html

Clayton, K., S. Blair, J.A. Busam, S. Forstner, J. Glance, G. Green, A. Kawata. 2020. “Real solutions for fake news? Measuring the effectiveness of general warnings and fact-check tags in reducing belief in false stories on social media.” Political Behavior 42 (4): 1073-1095. https://link.springer.com/article/10.1007/s11109-019-09533-0

De Meo, P., E. Ferrara, G. Fiumara, and A. Provetti. 2014. “On Facebook, Most Ties Are Weak.” Communications of the ACM 57 (11): 78-84. https://cacm.acm.org/magazines/2014/11/179820-on-facebook-most-ties-are-weak/fulltext

Ecker, U. K., J.L Hogan, and S. Lewandowsky. 2017. “Reminders and repetition of misinformation: Helping or hindering its retraction?” Journal of Applied Research in Memory and Cognition 6 (2): 185-192. https://www.sciencedirect.com/science/article/abs/pii/S2211368116301838

Fazio, L. K., N.M. Brashier, B.K. Payne, and E.J. Marsh. 2015. “Knowledge does not protect against illusory truth.” Journal of Experimental Psychology: General 144 (5): 993. https://www.apa.org/pubs/journals/features/xge-0000098.pdf

Fichera, A. 2020. “Report Resurrects Baseless Claim that Coronavirus Was Bioengineered.” Factcheck.org September 17.  https://www.factcheck.org/2020/09/report-resurrects-baseless-claim-that-coronavirus-was-bioengineered/

Granovetter, M. S. 1973. “The strength of weak ties.” American Journal of Sociology 78 (6): 1360-1380. https://www.jstor.org/stable/2776392?seq=1

Guillory, J. J., and L. Geraci. 2013. “Correcting erroneous inferences in memory: The role of source credibility.” Journal of Applied Research in Memory and Cognition 2 (4): 201-209. https://www.sciencedirect.com/science/article/abs/pii/S2211368113000752

Hall Jamieson, K. 2021. “How to Debunk Misinformation about COVID, Vaccines and Masks.” Scientific American April 1.  https://www.scientificamerican.com/article/how-to-debunk-misinformation-about-covid-vaccines-and-masks/

Hyland-Wood, B., J. Gardner, J. Leask, and U.K. Ecker. 2021. “Toward effective government communication strategies in the era of COVID-19.” Humanities and Social Sciences Communications 8 (1): 1-11. https://www.nature.com/articles/s41599-020-00701-w

iamhere international. 2021. “Our Mission.” iamhere international. https://iamhereinternational.com/.

Kunda, Z. 1990. “The case for motivated reasoning.” Psychological Bulletin 108 (3): 480–498. https://doi.org/10.1037/0033-2909.108.3.480

Lazer, D., D.J. Ruck, A. Quintana, S. Shugars, K. Joseph, N. Grinberg, R.J. Gallagher et al. 2021. “The COVID States Project #18: Fake news on Twitter.” OSF. https://doi.org/10.31219/osf.io/vzb9t

Mackay, I.M. 2020. “The Swiss cheese infographic that went viral.” Virology Down Under December 26.  https://virologydownunder.com/the-swiss-cheese-infographic-that-went-viral/.

Malecki, K. M., J.A. Keating, and N. Safdar. 2021. “Crisis communication and public perception of COVID-19 risk in the era of social media.” Clinical Infectious Diseases 72 (4): 697-702. https://academic.oup.com/cid/article/72/4/697/5858208

Margolin, D. B., A. Hannak, and  I. Weber. 2018. “Political fact-checking on Twitter: When do corrections have an effect?” Political Communication 35 (2): 196-219. https://www.tandfonline.com/doi/abs/10.1080/10584609.2017.1334018?journalCode=upcp20

Marwick, A. E. 2018. “Why do people share fake news? A sociotechnical model of media effects.” Georgetown Law Technology Review 2 (2): 474-512. https://georgetownlawtechreview.org/why-do-people-share-fake-news-a-sociotechnical-model-of-media-effects/GLTR-07-2018/

Myers West, S. 2018. “Censored, suspended, shadowbanned: User interpretations of content moderation on social media platforms.” New Media & Society 20 (11): 4366-4383. https://journals.sagepub.com/doi/abs/10.1177/1461444818773059

Nagler, R. H. 2014. “Adverse outcomes associated with media exposure to contradictory nutrition messages.” Journal of health communication 19 (1): 24-40. https://pubmed.ncbi.nlm.nih.gov/24117281/

Pennycook, G., Z. Epstein, M. Mosleh, A.A. Arechar, D. Eckles, and D.G. Rand. 2021. “Shifting attention to accuracy can reduce misinformation online.” Nature. https://www.nature.com/articles/s41586-021-03344-2

Petri, W. 2020. “COVID-19 vaccines were developed in record time – but are these game-changers safe?” The Conversation November 20. https://theconversation.com/covid-19-vaccines-were-developed-in-record-time-but-are-these-game-changers-safe-150249

Pew. 2019b. “Americans Are Wary of the Role Social Media Sites Play in Delivering the News.” Pew Research Center October 2. https://www.journalism.org/2019/10/02/americans-are-wary-of-the-role-social-media-sites-play-in-delivering-the-news/

Reuters. 2020. “Fact check: Lung tissue of an ‘aborted male foetus’ is not in the vaccine for coronavirus.” Reuters November 16. https://www.reuters.com/article/uk-factcheck-vaccine/fact-check-lung-tissue-of-an-aborted-male-foetus-is-not-in-the-vaccine-for-coronavirus-idUSKBN27W2I7

Shane, T., and P. Noel. 2020. “Data deficits: why we need to monitor the demand and supply of information in real time.” First Draft News September 28. https://firstdraftnews.org/long-form-article/data-deficits/

Steenhuysen, J. 2021. “Fauci says U.S. political divisions contributed to 500,000 dead from COVID-19.” Reuters February 22.  https://www.reuters.com/article/us-health-coronavirus-fauci/fauci-says-u-s-political-divisions-contributed-to-500000-dead-from-covid-19-idUSKBN2AM2O9

Tandoc Jr., E. C., D. Lim, and R. Ling. 2020. “Diffusion of disinformation: How social media users respond to fake news and why.” Journalism 21 (3): 381-398. https://journals.sagepub.com/doi/abs/10.1177/1464884919868325?journalCode=joua

Vosoughi, S., D. Roy, and S. Aral. 2018. “The spread of true and false news online.” Science 359 (6380): 1146-1151. https://science.sciencemag.org/content/359/6380/1146

Vraga, E. K., and L. Bode. 2017. “Using expert sources to correct health misinformation in social media.” Science Communication 39 (5): 621-645. https://journals.sagepub.com/doi/abs/10.1177/1075547017731776?journalCode=scxb

Vraga, E. K., and L. Bode. 2018. “I do not believe you: How providing a source corrects health misperceptions across social media platforms.” Information, Communication & Society 21 (10): 1337-1353. https://www.tandfonline.com/doi/full/10.1080/1369118X.2017.1313883

Vraga, E. K., and L. Bode. 2020a. “Defining misinformation and understanding its bounded nature: Using expertise and evidence for describing misinformation.” Political Communication 37 (1): 136-144. https://www.tandfonline.com/doi/abs/10.1080/10584609.2020.1716500?journalCode=upcp20

Vraga, E. K., and L. Bode. 2020b. “Correction as a Solution for Health Misinformation on Social Media.” American Journal of Public Health 110 (S3): S278-S280. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7532323/

Vraga, E. K., M. Tully, and L. Bode. 2020. “Empowering users to respond to misinformation about Covid-19.” Media and Communication 8 (2): 475-479. https://www.cogitatiopress.com/mediaandcommunication/article/view/3200

Walter, N., and R. Tukachinsky. 2020. “A meta-analytic examination of the continued influence of misinformation in the face of correction: How powerful is it, why does it happen, and how to stop it?” Communication Research 47 (2): 155-177. https://journals.sagepub.com/doi/abs/10.1177/0093650219854600

(WHO) World Health Organization. 2020. “Infodemic.” World Health Organization. https://www.who.int/health-topics/infodemic#tab=tab_1

(WHO) World Health Organization. 2021. “WHO Coronavirus (COVID-19) Dashboard.” World Health Organization https://covid19.who.int/
