
Science Denial and the Science Classroom

    Published Online: https://doi.org/10.1187/cbe.12-03-0029

    Biology teachers are accustomed to engaging individuals who do not accept biological evolution. This rejection ranges from ignorance of the evidence to outright denial or distortion of data. The list of science denial topics has grown alarmingly over the years to include: HIV as the cause of AIDS, exaggeration of the health and environmental risks of genetically modified organisms, the existence of holes in the ozone layer, the rise in antibiotic resistance, health risks caused by cigarette smoking, exaggeration and denial of harmful side effects of pesticides, water and environmental damage caused by hydraulic fracturing, the fear that vaccines do more harm than good, and, of course, global warming and climate change. Teaching climate science has become so perilous in some school districts that the National Center for Science Education, long known for activism in the arena of evolution education, has greatly expanded efforts in the arena of climate (http://ncse.com/climate).

    In the face of denial by a substantial portion of society, the natural tendency of a scientist or educator is to pile on the data and examples, deepen explanations, and reason rigorously. However, such strategies may not be the most effective and are certainly not sufficient. In this feature, I will explore the nature and roots of science denial and review resources that instructors and students can use to recognize denial strategies while deepening their understanding of the scientific process. Filtering information is central to human nature, so there are issues of psychology, communication style, and the role of the media to consider. Clearly, no one is thoroughly rational and objective in all his thoughts and actions. Each of us is guilty of denying facts on occasion, perhaps when it comes to our pets, our health, or our favorite sports team. However, systematic and organized denial that employs strategies of persuasion to combat scientific evidence is another matter.

    Denialism is the systematic rejection of empirical evidence to avoid undesirable facts or conclusions. Various observers have distilled a number of common strategies employed by science denialists. The Skeptical Science website (www.skepticalscience.com/5-characteristics-of-scientific-denialism.html), focused primarily on global warming, has a concise list of denialist tactics that I paraphrase here:

    1. Conspiracy theories: Propose that a complex and secretive conspiracy accounts for an overwhelming body of scientific evidence and consensus. Question the quality of the science. Former South African President Thabo Mbeki based his denial that HIV causes AIDS on conspiracy theories that appealed to legitimate historical perspectives.

    2. Fake experts: Highlight views inconsistent with established knowledge, often complemented by denigration of established experts, including questioning credentials, integrity, and motives. The tobacco industry employed industry-sympathetic scientists to attack mainstream research.

    3. Cherry-pick data: Draw on isolated papers that challenge the consensus view and neglect the broader body of research supported by hundreds of papers. A single paper (now discredited) suggesting a connection between vaccination and autism is a prime example; it has contributed to a rise in measles in the United States and other countries as fearful parents decline to vaccinate their children.

    4. Impossible expectations for research: Set unrealistic standards for data that invalidate an entire body of research. Since empirical science generally works with probabilities, denialists exaggerate uncertainty and gaps in data. The tobacco industry and antievolutionists have employed this tactic.

    5. Misrepresentation of opposition viewpoints: Employ logical fallacies, including an appeal to personal freedom or other core values. A common tactic is to associate an opponent with Nazi policy or even personal characteristics shared with Hitler or Stalin.

    The appeal to personal freedom has been marvelously lampooned in the book Thank You For Smoking by Christopher Buckley. A clip from the film version (http://movieclips.com/XYW5-thank-you-for-smoking-movie-ice-cream-politics; Figure 1) is well worth 2 min of your viewing time. It includes the following critical dialogue between a lobbyist father and a young son:


    Figure 1. The book and film Thank You For Smoking satirize lobbyists, particularly those working for the gun, tobacco, and liquor industries. The “ice cream cone” scene from the film is an engaging presentation of the denialist tactics of appealing to freedom and changing the subject.

    Joey: So, what happens when you're wrong?

    Nick: Well, Joey, I'm never wrong.

    Joey: But you can't always be right.

    Nick: Well, if it's your job to be right, then you're never wrong.

    Joey: But what if you are wrong?

    Nick: Okay, let's say that you're defending chocolate and I'm defending vanilla. Now, if I were to say to you, “Vanilla's the best flavor ice cream,” you'd say …?

    Joey: No, chocolate is.

    Nick: Exactly. But you can't win that argument. So, I'll ask you: “So you think chocolate is the end-all and be-all of ice cream, do you?”

    Joey: It's the best ice cream; I wouldn't order any other.

    Nick: Oh. So it's all chocolate for you, is it?

    Joey: Yes, chocolate is all I need.

    Nick: Well, I need more than chocolate. And for that matter, I need more than vanilla. I believe that we need freedom and choice when it comes to our ice cream, and that, Joey Naylor, that is the definition of liberty.

    Joey: But that's not what we're talking about.

    Nick: Ah, but that's what I'm talking about.

    Joey: But … you didn't prove that vanilla's the best.

    Nick: I didn't have to. I proved that you're wrong, and if you're wrong, I'm right.

    Joey: But you still didn't convince me.

    Nick: Because I'm not after you. I'm after them.

    While emphasizing the value of choice, the dialogue also illustrates the tactic of simply changing the subject. Several websites delve into denialist strategies a little deeper. The Hoofnagle brothers (one is a lawyer and the other a physiologist) have been at the forefront of identifying and analyzing denialism. They manage a blog (http://scienceblogs.com/denialism) that includes a more detailed primer on denialist tactics (http://scienceblogs.com/denialism/about.php). You will note a tone in the Hoofnagles' blog that departs from journalistic or scientific detachment. Blogs such as Skeptical Science (which has an app for countering climate denial arguments at www.skepticalscience.com/Newcomers-Start-Here.html) and the Hoofnagles' Denialism are positioned in opposition to the harsh tone of denialist blogs, such as the Forces pro-smoking website (www.forces.org) and the Heartland Institute (http://heartland.org), which denies global warming.

    Journalists and the mainstream media occupy an important position in mediating denialism messages. Despite the proliferation of blogs and social media, professional journalists have the ability to amplify and disseminate a message, and long-standing traditions of balanced reporting can work against true fairness and veracity. Denialists are well aware of the media ecosystem and construct rhetorical arguments to give the appearance of legitimate debate where none really exists. They appeal to journalistic practices to cover both sides of an issue, to include diverse perspectives, and to cover a controversy. The University of Wisconsin and the National Association of Science Writers have organized a meeting, Science Writing in the Age of Denial (http://sciencedenial.wisc.edu; Figure 2). The meeting will address the challenges journalists face with science denial, such as reporting in a politicized climate, journalistic ethics, false balance in reporting, persuasive writing, and covering controversy. Video recordings of the talks will be available from the website (at the time of this writing, the April meeting has not yet taken place).


    Figure 2. The University of Wisconsin is hosting a meeting on Science Writing in the Age of Denial that convenes journalists and scientists to explore the challenges of good reporting in the face of denialist tactics and a politicized climate.

    The topic of climate change has become especially challenging for journalists. A number of websites have been developed to disseminate information to reporters and other highly engaged audiences. Climate Central (www.climatecentral.org) is a research and journalism organization established to provide clear and up-to-date information concerning climate and energy. The site offers high-quality content that includes news bulletins, interactive graphics, special reports, and videos. The online magazine Grist (http://grist.org) is devoted to independent green news and has a section devoted to climate and energy. American University's School of Communication provides information on effective communication of climate science on a website called Climate Shift (http://climateshiftproject.org). These websites devoted to better science communication allow students to see what good-faith argument looks like in contrast with denialism. For example, Chris Mooney, a blogger for Discover magazine (blogs.discovermagazine.com/intersection/2011/04/21/false-balance-in-matthew-nisbets-climate-shift-report), has accused Matthew Nisbet of Climate Shift of false balance. These two are engaged in a healthy debate between individuals who strive for genuinely balanced reporting, irrespective of personal opinions and biases, but differ on where to draw the line. A recent editorial published in the Wall Street Journal highlights the challenge of conveying to a lay audience the nature of scientific skepticism (http://online.wsj.com/article/SB10001424052970204301404577171531838421366.html). The editorial, signed by 16 individuals with various climate science credentials, employs nearly all of the tactics of denialism, including comparing the consensus view on global warming with Soviet-era Lysenkoism. At the time of this writing (March 2012), 2850 comments have been posted in response to the editorial. William Nordhaus responded to a misleading description of his own work in the editorial by writing "Why the Global Warming Skeptics Are Wrong" in the New York Review of Books (www.nybooks.com/articles/archives/2012/mar/22/why-global-warming-skeptics-are-wrong).

    Denialists often misappropriate scientific skepticism. Climate scientist Richard A. Muller recently became a famous climate change skeptic (see www.scientificamerican.com/article.cfm?id=i-stick-to-science) when he was recruited to testify before Congress with expectations that he would support climate change denial. Muller had called Al Gore's An Inconvenient Truth a pack of half-truths and had expressed concerns about the quality of temperature data used to compile warming trends. He stunned the House Committee when he testified that his independent temperature measurement research confirmed the data about which he had originally been concerned. Muller showed Congress the colors of a true scientist: skeptical and grappling with evidence, but never denying evidence, even when it runs contrary to his personal leanings. You can read his testimony on the Scientific American website (www.scientificamerican.com/article.cfm?id=muller-hearing). Two other prominent global warming skeptics are Anthony Watts, who writes the blog Watts Up With That (http://wattsupwiththat.com), and Steve McIntyre of the Climate Audit blog (http://climateaudit.org). Watts and McIntyre characterize themselves as skeptical on some climate change issues, and Muller agrees that they are skeptics, not deniers. Unfortunately, the tone of some of their blog posts sounds denialistic. Watts's blog, for example, has a posting reacting to Nordhaus's response to the Wall Street Journal editorial (wattsupwiththat.com/2012/03/03/why-william-d-nordhaus-is-wrong-about-global-warming-skeptics-being-wrong). It is important that students understand the difference between the essential skepticism that all good scientists need and being a denialist or a knee-jerk contrarian. The reactions to skeptics who part from denialist camps are so strong that the skeptics are often denigrated as apostates. Richard Cizik, president of the New Evangelical Partnership for the Common Good (http://newevangelicalpartnership.org/?q=node/6), dramatically resigned his post at the National Association of Evangelicals. The proximal cause for his departure was comments he had made on shifting views of same-sex marriage, but he had been embattled for years over his views on environmental stewardship and his acceptance of evidence for global warming.

    One of the best ways to learn how to distinguish between a denialist and an honest skeptic is to delve into a case history. We are fortunate to have a tour de force case study in the book Merchants of Doubt by Naomi Oreskes and Erik M. Conway (Figure 3). The authors explain “how a handful of scientists obscured the truth on issues from tobacco smoke to global warming.” The behavior and tactics of these scientists embody the worst aspects of denialism. This book documents the activities of individuals who were motivated by an ideological perspective that led them to betray core scientific values, believing their ends justified denialist means. Merchants of Doubt is an enlightening case study and a stellar example of rigorous scholarship in science and history. The companion website (www.merchantsofdoubt.org) is especially valuable for the list of Key Documents it provides, as well as reference websites under the Resources tab. Students could conduct their own independent scholarship and draw their own conclusions as part of course assignments.


    Figure 3. The book Merchants of Doubt is a masterly work of science, history, and investigative reporting on the role of some scientists in denying global warming and other issues. The companion website has excellent primary and secondary resources relating to the authors' voluminous research.

    The typical citizen is a victim, not a perpetrator, of denialist tactics. Scholars from multiple disciplines have been working to better understand how the public receives messages with scientific content. How does the average person understand uncertainty? How do temperament and cultural affinities affect perceptions? Anthony Leiserowitz, director of the Yale Project on Climate Change Communication, has produced an outstanding 5-min video Global Warming's Six Americas (http://environment.yale.edu/profile/leiserowitz/multimedia/anthony-leiserowitz-on-global-warmings-six-americas). His research reveals that Americans fall into six categories when it comes to global warming: alarmed (18%), concerned (33%), cautious (19%), disengaged (12%), doubtful (11%), and dismissive (7%). He notes that the “alarmed” individuals are a highly engaged and motivated “issue public” and that, at nearly one in five Americans, they represent a significant proportion of the public. While the “dismissive” group is less than half as prevalent, they are also highly motivated and voluble. Encouragingly, people in all of the categories support changes to energy policy. What is the best way to shape global warming messages to reach the 75% in the middle (the concerned, cautious, disengaged, and doubtful segments)? Leiserowitz was formerly with Decision Research, an independent research organization that studies how people perceive risk and make decisions. For example, Decision Research has a number of projects in the area of risk perception and communication (www.decisionresearch.org/research/risk), as well as applied research projects on the environment (www.decisionresearch.org/research/environment) and medical decision making (www.decisionresearch.org/research/medical).

    A perceived conflict between religious beliefs and scientific information is the prime motivation for denying biological evolution. However, the motivations for denying global warming or refusing vaccines are less obvious. One might think that everyone would be interested in honest assessments of various risks. Trying to understand why people want to deny particular findings of science has led to an entire area of study termed “cultural cognition.” In a nutshell, cultural cognition refers to the influence of group values on individual perception. Dan Kahan and Donald Braman are leading scholars in the Cultural Cognition Project (Figure 4). Their website (www.culturalcognition.net) includes a syllabus for undergraduate and graduate courses (www.culturalcognition.net/teaching). It is fascinating to browse the project pages (www.culturalcognition.net/projects), which include studies on attitudes toward HPV vaccination, nanotechnology, and gun regulation. A core finding of their research is that “citizens experience scientific debates as contests between warring cultural factions” (see www.culturalcognition.net/browse-papers/fixing-the-communications-failure.html).


    Figure 4. The website of the Cultural Cognition Project has a number of excellent resources for exploring the field of cultural cognition, which is concerned with understanding how affinity groups affect perception and decision making.

    Group affinities and cultural background are not the only factors that influence how people perceive risks and make decisions. There are fundamental aspects of human cognition that influence decision making. Consider this simple math problem: a bat and ball cost $1.10 in total; the bat costs $1 more than the ball. What is the cost of the ball? If you casually and quickly thought “a dime,” you are in good company—over half of 3500 individuals sampled at eight prestigious universities gave that answer. If you paused first, and decided to fully engage your rational mind instead of using intuition, you easily solved the math and came up with the correct answer, a nickel (the ball costs $0.05 and the bat $1.05, and $1.05 + $0.05 = $1.10). The key is getting people to slow down and activate their rational system, especially when facing a horde of overriding cues. The Harding Center for Risk Literacy at the Max Planck Institute for Human Development studies decision making with the hope of helping people learn how to make less biased decisions; their particular focus is on health issues. The Harding Center offers a quiz to assess your risk literacy (www.harding-center.com/what-you-should-know). Getting people to consciously think about their own attitude toward assessing risks is an important step toward rational decision making. Researchers agree that poor decision making is rampant, but they are divided over how to fix this problem. The Harding Center promotes a curriculum for teaching statistical thinking to young children (see www.nature.com/news/2009/091028/full/4611189a.html). The Cochrane Collaboration (www.cochrane.org) provides easy access to the best available evidence for making healthcare decisions. Perhaps similar projects could be developed for a variety of scientific topics to help people better discern and weigh the scientific consensus.
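
    The intuitive “dime” answer implicitly treats the $1 difference as the bat's entire price. A minimal worked solution, written out here only to make the reasoning step explicit (using b for the price of the ball, so the bat costs b + 1.00):

    \[
    b + (b + 1.00) = 1.10 \;\Rightarrow\; 2b = 0.10 \;\Rightarrow\; b = \$0.05, \qquad \text{bat} = \$1.05.
    \]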

    In closing I would like to mention Ignaz Semmelweis, the nineteenth-century obstetrician from Budapest. Semmelweis began practicing obstetrics in Vienna's free maternity wards in 1846. He was shocked at the high incidence of death among mothers following birth, recording 36 deaths among 208 mothers in his first month working at Ward 1. Upon investigation, he learned that the death rate at Ward 2 was much lower and that poor women preferred to give birth in the streets rather than go to Ward 1. A common explanation among nurses was the existence of a poisonous gas in Ward 1. Semmelweis suspected the high mortality among new mothers might have more to do with the fact that Ward 1 also had a morgue and that doctors performed both autopsies and births. Ward 2 had only midwives and performed only births. Semmelweis suspected the physicians were transferring something harmful from person to person. He ordered all medical staff to wash their hands in chlorinated lime solution and had the ward scrubbed with calcium chloride. Within several months, he managed to reduce the death rate to a negligible level. The medical establishment of the day refused to accept Semmelweis's published findings, despite the success of his methods at multiple sites. The tragic end of his life was influenced by his peers' rejection of his sound conclusions (www.historylearningsite.co.uk/ignaz_semmelweis.htm). The Wikipedia article on Semmelweis is particularly good and includes a lot of data from Semmelweis's studies and publications (en.wikipedia.org/wiki/Ignaz_Semmelweis).

    In Semmelweis's day, many doctors were offended at the suggestion that they might carry diseases and should wash their hands. Today, we are fortunate that we understand the germ theory and that medical practice is generally grounded in sound science. Nevertheless, some physicians practicing in the twenty-first century suffer from a perception deficit when it comes to hand washing. Steven Levitt and Stephen Dubner, known for their Freakonomics book, website, and radio programs, explore economics from interesting angles, including the difference between people's perception of their behavior and their true actions. They have presented information about physicians and hand washing; a short video is worth viewing to appreciate their analysis (www.youtube.com/watch?v=AEkOmn5hjFU). In one study, doctors reported washing their hands 73% of the time, while nurses observed the doctors washing only 9% of the time, a shocking disparity. Freakonomics reports on a successful campaign waged by Cedars-Sinai Medical Center in Los Angeles to improve hand-washing compliance by using humorous posters and screen savers throughout the hospital (www.freakonomics.com/2012/01/24/how-to-get-doctors-to-wash-their-hands-visual-edition).

    Hand washing and a perception deficit among physicians bring us full circle back to cultural cognition. The Joint Commission Center for Transforming Healthcare has found that the only sustainable way to enforce good hand-washing practice at hospitals is to instill a culture in which appropriate hygiene is the norm. Various campaigns, such as frequent email reminders or posted signs, can help but are not sufficient. Only constant reinforcement through interaction with peers results in lasting solutions (www.centerfortransforminghealthcare.org/projects/projects.aspx). In case you were wondering: yes, nurses comply with proper hand-washing regimens better than physicians do (www.nursingtimes.net/hand-hygiene-compliance-exploring-variations-in-practice-between-hospitals/1944149.article).

    In this feature, I began by considering organized and intentional denialism, about which every honest scientist and educator must be concerned. The denial of evidence strikes at the very heart of what science is and why we do research. In a politically charged atmosphere, it is important to step back and consider the many factors of cultural cognition and the psychology of decision making that influence denialism. It is encouraging to see the popularity of books such as Malcolm Gladwell's Blink (see www.gladwell.com/blink) and Daniel Kahneman's Thinking, Fast and Slow (see www.brainpickings.org/index.php/2011/10/26/thinking-fast-and-slow-daniel-kahneman), which dissect our faulty decision-making faculties. It appears that the general public is interested in how the human brain works in order to improve personal and societal decisions. However, the consensus of expert opinion is that changing behavior and improving decision making are very difficult. One of the privileges of being an educator is that your students come with minds that are open to learning something new, such as an understanding of how science works and its role in society. Students could gain an appreciation of science by examining some of the denialism materials presented in this review. You may be aware that there is a push to implement common education standards across the United States that is gaining ground state by state (www.corestandards.org). Nonfiction science reading has a prominent place in the English language arts standards, which include an emphasis on science literacy. Teachers and students who understand the role of science in our society should be able to recognize a denialist tactic when they see it.