Climate Change

How to respond to science-denial trolls

By John Cook, July 26, 2016

As a researcher writing about climate science denial, I am a magnet for online trolls. Climate change is an issue over which a small but vocal minority persists in rejecting and attacking the scientific consensus, specifically the finding by 97 percent of experts that humans are causing global warming. When I turn the spotlight around to expose the techniques of science denial, the reaction can be intense.

My last Bulletin column, which was on the topic of scientific consensus, offers a striking example. The story itself highlights a new study reaffirming that yes, there really is scientific consensus on the causes of climate change. One third of the comments were personal attacks, aimed either at me or at other readers. They came from both sides, using terms such as “eco-troll slacktavist” and “paid shill.” One quarter of the attacks were directed at me, mostly involving claims of deception. I was accused of “dishonest rumor mongering” as well as being “blatantly dishonest” and an “expert in propaganda.”

Because my research focuses on misinformation and attacks on scientists, I have a unique perspective when my own work comes under fire. It’s not just an attack—it’s data! Examining the rhetorical techniques used in trolling comments also presents instructional opportunities. For example, I once explained how confirmation bias can lead people to cherry pick data, rely on non-experts who tell us what we want to hear, and resort to implausible conspiracy theories. I then predicted that these argumentative strategies would appear in the subsequent comment thread. Sure enough, they flooded in. This enabled other readers to participate in a valuable critical thinking exercise, identifying and responding to the denialist fallacies.

Unfortunately, the impact of trolls goes far beyond merely disrupting online discussion. Cyber-bullying can take an emotional toll on individuals or a whole community. Personally, my approach in online forums is to engage in good-faith discussions, which can lead to fruitful exchanges. But the moment a commenter demonstrates bad-faith or off-topic trolling, I disengage—adhering to the “don’t feed the troll” maxim.

Uncivil comments also have an influence beyond their direct targets, giving social license to like-minded readers to behave uncivilly—a phenomenon I like to call “trickle-down trolling” (hat-tip to Mitt Romney). Uncivil comments can also polarize readers who might otherwise have taken a more nuanced view of the subject being discussed.

We see polarization caused by toxic dialogue at a broader public level, too, with US Republicans and Democrats moving further apart in their attitudes and beliefs about climate change. One analysis found that organizations receiving corporate funding have published an increasing amount of climate misinformation with the express goal of causing polarization. (To be sure, at this higher level, polarization is the result of a number of factors.)

To counter misinformation, colleagues and I launched an online course about climate science denial. It was important that the class discussion forum serve as a safe and constructive learning environment for our students, so the possibility that the forum might be disrupted by trolls was a serious concern.

Fortunately, there are a number of ways to mitigate the impact of trolling. We proactively put in place five strategies to limit potential disruption, summarized with the (admittedly somewhat contrived) acronym T.R.O.L.L.

Teamwork. Not all websites have the resources of the Huffington Post, which hired a 40-person comment-moderation team to help keep things civil. We solved this problem by developing a team of volunteer moderators across a range of time zones, ensuring that the global community in the discussion forums was monitored at all times. Constant communication via Skype kept all the moderators coordinated and up-to-date. We also recruited extra moderators from the student community, which was important to avoid burnout and maintain a sustainable, long-term team.

Rules. We defined and communicated clear guidelines. They included rules such as “no ad hominem attacks” and “no profanity or inflammatory tone,” with an emphasis on fostering constructive behavior rather than enforcing a particular viewpoint. (We adapted the rules from the comments policy at Skeptical Science, the product of years of experience and deliberation.)

Oversight. Consistently applying moderation guidelines sends an important signal to participants: they must follow the rules in order to take part in the discussion.

Light Touch. It was also important that our moderators not be too heavy handed. The course aimed to raise students’ critical thinking skills, equipping them to identify the fallacies in denialist arguments and know how to respond. Rather than jump in to censor comments expressing denialist viewpoints, we encouraged students to use those comments as active learning opportunities, responding with arguments we had introduced in the lectures.

Log. Like the light-touch approach, this strategy is perhaps unique to our course: we documented all moderator activity in a Google spreadsheet. This created accountability among the moderators and allowed the moderation team to stay up to date on the latest activity.

Our T.R.O.L.L. approach resulted in smoothly run discussion forums, providing a positive learning environment for the students. Trolling dropped to a negligible level (from a peak of more than 100 posts per day early in the course to zero on some days) once the trolls realized their efforts yielded little reward.

But what about a publisher who doesn’t have access to a team of moderators? One useful, low-cost tactic is removing the ability to post anonymously. When Huffington Post removed anonymity (by making people comment through their Facebook account), the number of offensive (and ALL CAPS) words in comments dropped. Interestingly, the number of typos also decreased.

Heeding the “don’t feed the troll” rule is also useful. Ignoring incendiary comments is one of the most effective ways of thwarting a troll’s attempts to stir the pot. The flaw in this strategy is that it only takes one reactive reader for a situation to escalate.

A third, relatively easy strategy is to educate readers—through an onscreen message or a link to more information—about the nature of trolls, who often target inexperienced users. Once trolling methods are explained, readers are better able to identify baiting behavior and avoid falling for it.

Trolls can have a destructive impact, disrupting online communities and damaging the reputation of publishers. However, a number of proactive strategies can reduce the damage they cause. Finding the right strategic mix can transform an editorial headache into a thriving community.

Below: Presentation at the University of Adelaide by Carrie Finn and John Cook, describing the moderation strategies employed in the Denial101x online course.

