The Oxford Dictionaries recently selected “post-truth” as their word of 2016, a reflection of the deluge of misinformation that the public has endured over the past election year. During the 2016 election’s latter weeks, fake news—that is, purely fabricated stories with no basis in fact—was shared more on Facebook than real news.
This surge in misinformation didn’t emerge from a vacuum. Industry and ideological organizations have been seeding public discourse with self-serving misinformation for many decades.
Consequently, scientists have long been grappling with how to distinguish misinformation from accurate information. The challenge is that misinformation typically masquerades as science. How do you tell a genuine scientific critique apart from outright denial of science itself, when science is precisely the process distinguished by its emphasis on evidence, experiment, and observation?
This is an important question. Philosophers of science see dissent and original thinking outside the mainstream as a good thing, and scientific progress depends on it. For example, Albert Einstein upset our fundamental understanding of time and space with his theories of relativity. Alfred Wegener broke the existing paradigm about the structure of our planet when he introduced continental drift. Australian scientist Barry Marshall overturned the conventional understanding of what causes stomach ulcers by famously drinking a petri dish of bacteria, work that led to a Nobel Prize.
Detrimental scientific dissent. However, some forms of dissent have been harmful to scientific understanding and placed the public at risk. Since the mid-20th century, the tobacco industry has poured millions of dollars into deliberate campaigns of misinformation. These efforts have confused the public about the evidence linking smoking to lung cancer and delayed policies to regulate the tobacco industry.
Science philosophers Justin Biddle and Anna Leuschner divide scientific dissent into two categories: beneficial and detrimental to our scientific understanding. Beneficial dissent forces researchers to look closely at their methodology, reexamine their assumptions, and sharpen their conclusions.
Detrimental dissent, on the other hand, can impede scientific progress in several ways. First, it forces scientists to respond to an endless wave of objections and demands. These take the form of complaints, public and private attacks, or attempts to have studies retracted. Critics have used Freedom of Information requests to obtain scientists’ emails and in extreme cases, even hacked private correspondence.
The most prominent example of this type of cyber-attack is the so-called ‘climategate’ incident, where scientists’ emails were hacked and published on the Internet. Although nine independent investigations all concluded that there was no evidence of wrongdoing on the part of the scientists, the stolen emails (and subsequent investigations) continue to be held up by some as evidence of a global climate conspiracy.
Second, attacks on the scientific community create an atmosphere where scientists become fearful of the hostile response their research may receive. This in turn can influence, sometimes at a subconscious level, how they communicate their scientific findings—for example, by downplaying the severity of their results. One analysis found that predictions from the Intergovernmental Panel on Climate Change were 20 times more likely to underestimate future climate impacts than overestimate them.
This downplaying of results reflects an asymmetry in how scientists' research is challenged: "alarmist" predictions are met with intense criticism, while "harmless" predictions attract far less hostility. This behavior on the part of scientists is also known as "erring on the side of least drama."
I’ve observed this behavior firsthand. At a science conference, I was surprised when a climate scientist presented research results indicating a startling increase in extreme weather. The result was much stronger than media reports of the same research had suggested, because the press release had emphasized the bottom of the estimated range of climate impacts rather than the best estimate. When I asked the scientist why he had emphasized the lower estimates, he replied that they were less likely to get attacked.
Identifying detrimental dissent. As decades pass, the difference between good-faith dissent and bad-faith misinformation can become clear with the benefit of hindsight. We look back at the Mad Men-esque ad campaigns of the tobacco industry—with advertisements that proclaim that “More Doctors Smoke Camels Than Any Other Cigarette” and that their products are “Made by Tobacco Men, Not Medicine Men”—with clear-sighted revulsion. But how does one distinguish such things while still in the middle of an ongoing controversy? After all, it is scarcely the fish that discovers water. Or put another way, how do we discern the difference between genuine scientific skepticism and science denial?
It’s deeply problematic to try to determine a person’s motives for questioning a prevailing scientific consensus. They may be driven by profit, by ideology, or by the pure motive of advancing scientific understanding. To tease out the difference between skepticism and denial, we are on more solid ground if we analyze behavior rather than speculate about motives.
Fortunately, there is a body of research into the phenomenon of science denial. Across a wide range of scientific disciplines, denial expresses itself in a surprisingly consistent manner. Whether it’s denial of the efficacy of vaccines, denial of the evolution of humans from earlier life forms, or denial of the evidence for human-caused global warming, the same characteristics appear over and over again.
The five common characteristics of science denial are fake experts, logical fallacies, impossible expectations, cherry-picking, and conspiracy theories. In our course on climate science denial, we summarize these traits with the acronym FLICC.
Particularly problematic is conspiratorial thinking, which exhibits certain characteristic thought patterns. Conspiracy theorists consider the conspirators to be all-powerful and omnipresent, yet also small in number. In their worldview, nothing happens by accident: seemingly random events are incorporated into the grand narrative of an overarching conspiracy. Because the official account (e.g., the scientific consensus) must be wrong, they can adopt mutually contradictory positions so long as each defies the official account.
Conspiracy theories are self-sealing by nature. When conspiracy theorists encounter evidence disproving their theory, they assume the conspirators generated the evidence. Consequently, presenting scientific evidence to those who deny a scientific consensus is largely futile and sometimes counterproductive.
What constructive dissent looks like. There is a way for non-scientists to participate in constructive scientific dissent. Indeed, the public should be given the opportunity to engage in scientific debate without resorting to the strategies and techniques of denial. What might beneficial dissent look like?
One case study in the Journal of Social and Political Psychology demonstrates a constructive approach to engaging with the scientific community. Nicholas Brown was “essentially a stranger to academia,” but nevertheless established a dialogue with senior academics who had conducted key research in the field of positive psychology.
He subsequently collaborated with an expert to publish five scientific articles on the subject, rebutting a cornerstone paper of the positive psychology movement; the subsequent scientific discussion was published in the same journal. Brown and his co-authors characterized the process as follows:
…[t]he system worked as it should: Everyone remained calm and polite, the various publishing and appeals processes were tested and observed to work, the scientific record was corrected, the field of positive psychology took stock, and nobody felt the need to publish anyone’s home address or other personal details on the Internet.