By John Cook | March 13, 2017
Over the last year, facts have taken a bit of a beating. Oxford Dictionaries added insult to injury when it named “post-truth” the 2016 word of the year. In early 2017, the phrase “alternative facts” entered public discourse, when US presidential adviser Kellyanne Conway used it on air to describe falsehoods the White House press secretary had told about the number of people who watched Donald Trump’s inauguration.
But misinformation is not a recent phenomenon. Back in 2014, the World Economic Forum listed “the rapid spread of misinformation online” as one of ten major trends affecting world events. For those of us already engaged with the issue of climate change, “post-truth” and “alternative facts” have a distinct ring of familiarity. Climate scientists have wrestled with false purported “facts” about climate change for decades. And it turns out that those who reject the scientific consensus on climate change possess many of the same traits as those who embrace post-truthism and alternative “facts” on other issues, including conspiratorial thinking and incoherent worldviews.
Fortunately there is a silver lining to these many years of climate change denial: There is now a large and instructive body of research into understanding why people dismiss science. Social scientists are even studying ways to counter misinformation and stop the spread of denial. Psychological research offers insights into why alternative facts are dangerous, and also a path toward countering them.
The insidious danger of alternative facts

We tend to think misinformation is dangerous simply because it misinforms, but alternative facts also have a more insidious influence. The danger was highlighted in a tweet by Russian chess grandmaster Garry Kasparov, who wrote, “The point of modern propaganda isn’t only to misinform or push an agenda. It is to exhaust your critical thinking, to annihilate truth.”
Misinformation and alternative “facts” can cause people to suspend judgment. When presented with two conflicting pieces of information—fact and alternative “fact”—people often don’t know which to believe, so they choose to believe neither. This dynamic was demonstrated in a recent psychology experiment by Sander van der Linden and his colleagues, published in the February 2017 issue of Global Challenges.
The researchers presented different groups of study participants with different messages about climate change. One group was told the fact that 97 percent of climate scientists agree that humans are causing global warming. When people learned of the overwhelming scientific consensus, both their acceptance of climate change and perception of scientific agreement on the matter increased.
Another group was presented with misinformation designed to cast doubt on the scientific consensus on climate change. The text came from a website, The Global Warming Petition Project, which features a petition signed by 31,000 people with science degrees stating that humans aren’t disrupting the climate. This supposedly large number of dissenting scientists is used to argue that there isn’t a scientific consensus on climate change, despite the fact that 99.9 percent of the signatories aren’t climate scientists. (A recent analysis found that over six months in 2016, a story about this petition claiming that global warming is a hoax was the most shared climate-change article on social media.) The researchers found that showing text from the petition website to study participants lowered acceptance of climate change and the perception of consensus.
Where the new study gets interesting is in what happened with a third group. The researchers presented its members with both the 97 percent consensus, and the misinformation about the 31,000 dissenting scientists. This group showed no change in its acceptance of climate change science. The two conflicting pieces of information—fact plus alternative fact—canceled each other out. Not knowing which information to believe, people chose to believe neither.
Kasparov nailed it when he characterized misinformation as annihilating truth. Fact and alternative fact are like matter and anti-matter. When the two collide, there is an explosion of heat and light, leaving behind nothing.
Therein lies the danger of alternative “facts.” To do damage, they don’t need to convince people of their veracity, or be coherent or evidence-based. They just need to exist. Presenting an alternative to the facts can be sufficient to stop people from believing in facts.
Evidence that evidence isn’t enough

There was one final twist in the experiment. The researchers tested an additional climate message on a fourth group. As well as a fact and an alternative fact, they included a warning about how the misinformation was used in an attempt to cast doubt on the truth. They told group four that virtually none of the 31,000 scientists who signed the misleading petition possessed the relevant expertise in climate science, and that these fake “experts” had been used to cast doubt on the expert consensus. In addition, the researchers explained to the fourth group that although 31,000 seemed like a large number, it was only 0.3 percent of the number of Americans with a science degree.
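The arithmetic behind that 0.3 percent figure is straightforward. As a rough back-of-the-envelope check—assuming on the order of 10 million Americans hold a science degree, a denominator consistent with the researchers’ figure but not stated in this article:

$$
\frac{31{,}000}{10{,}000{,}000} \approx 0.003 = 0.3\ \text{percent}
$$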
This part of the study was based on a branch of psychological research known as inoculation theory. It is analogous to vaccination, in which people acquire immunity against a disease by being exposed to a weak form of it. Similarly, a number of studies have found that people can develop resistance against misinformation by being exposed to weakened versions of it. In other words, once people can see through the techniques being used to mislead them, those techniques no longer work.
The researchers found that the misinformation was mostly neutralized by the inoculating text. Interestingly, the effect was the same across political ideology. Typically, conservatives are less likely to accept the human role in climate change and more susceptible to misinformation about it. In this study, however, inoculation was just as effective with Republicans as with Democrats. Nobody likes to feel they’ve been misled by fallacious arguments, regardless of where they sit on the political spectrum.
Alternative facts are dangerous because people aren’t able to resolve the conflict between fact and alternative fact. Inoculating messages resolve the conflict and help people determine which is fact and which is falsehood. This research tells us that in a post-truth world, facts are necessary but insufficient.
We can protect our facts by packaging them with inoculating messages that explain how they get distorted. This might involve explaining the logical fallacy or technique used to misinform. Alternatively or additionally, it might require explaining the motives of the person giving the misinformation.
This dual form of science communication—coupling fact with inoculation—raises science literacy and increases critical thinking. In the current post-truth environment, where alternative facts abound and science is under threat, it is more imperative than ever that we take an evidence-based approach to science communication. The evidence tells us that evidence is not enough—we also need to protect our science before we send it out into the world.