Authors Sara and Jack Gorman provide a rubric for combating science denial in a revised and updated edition of “Denying to the Grave,” published by Oxford University Press. The Gormans are a father-daughter collaborative team who divide the world into science-believers and science-deniers. By science-believers they mean those who live by a hypothesis- and evidence-based credo. The deniers find numerous faults with science as presented to them. Science-deniers have great impact: Witness our vaccination wars today, and in 2014, the Ebola panic in the United States — an unjustified panic, in the Gormans’ view.
The Gormans implore scientists to combat science denial with informed engagement. Their call for a “corps of scientific first responders… to mitigate the effects of misinformation” is based upon an understanding of the six key factors they delineate as the foundations for denial — the reasons why all of us can, at times, turn away from scientific understanding when we make decisions:
Conspiracy theories
Science today is seen by deniers as a corporate conspiracy. Science is maligned by science-deniers as being manipulated by self-serving, conflicted “scientific leaders” — doctors who are in the pocket of the medical-industrial complex, or “big pharma.” Science-believers, for their part, perceive that charismatic denier leaders have seized the narrative, manipulating information and people to dominate those who feel powerless and vulnerable, in order to advance a self-serving agenda. The Gormans recommend that scientists understand the who and the why of conspiracy theories, and then work to reset the narrative to one that is science-forward.
Charismatic leaders
The Gormans offer case studies to highlight the potential and the power of opportunistic and arguably predatory “leadership” in the science-denier community. They discuss Andrew Wakefield (linking vaccines and autism), Jenny McCarthy (opposed to vaccines), Gilles-Eric Séralini (anti-GMO — genetically modified organisms), Wayne LaPierre (National Rifle Association), Joseph Mercola (espousing his own “alternative medicine” approaches to healing) and Donald Trump (on various scientific areas), demonstrating the wide range of personalities and approaches in rejecting the role of science in decision-making. In response to these charismatic deniers, the Gormans argue that it is possible to resist such persuasion through intentional cognitive and communication strategies that are not “dry, pedantic harangues” to the public, but rather are appeals to emotion as much as to reason.
Confirmation bias
According to the Gormans, the “tendency to attend only to information that agrees with what we already think is true” is an active process that serves to allow people to ignore information that does not comport with their (often mistaken) beliefs and allows for the perpetuation of myths. Fighting this bias requires going beyond the facts, and the Gormans recommend iterative thinking and planning to combat science-deniers’ flawed arguments before messaging — as well as avoiding didactic, numbing lectures.
Causality
Not knowing the details behind phenomena sets us all up to make false connections, such as attributing Acquired Immunodeficiency Syndrome (AIDS) — when it first appeared — to the lifestyles of affected individuals. Here, the Gormans review Karl Popper’s foundation of scientific hypothesis testing: that disproving a theory is a more valid test of truth than proving one. After discussing the methods by which scientists establish causality, the Gormans recommend open-ended conversational approaches with scenario-testing to convince those individuals who are “on the fence.” These individuals, who comprise the majority of science-deniers, are more likely to be open to learning than those who are fervently, and emotionally, attached to the science-denying charismatic leaders.
Avoidance of complexity
Whether or not we bemoan the passing of expansive and sophisticated presidential speeches, we all recognize that our sound-bite society and the Twitter-based communications sent by our elected leaders leave an exploitable gap in information about complex and nuanced issues. This shift may be blamed on the media’s desire to sensationalize, or on the fact that humans are “wired” to choose simple information over complex information. The Gormans refer to the work of Daniel Kahneman and Amos Tversky, founders of behavioral economics, arguing that “making reasoned choices is energy-consuming and tiring.” What can we do, then? Teaching the scientific method to the public, and coaxing deniers toward the scientific approach, perhaps through “motivational interviewing,” may help us message science more effectively, and thus help the public better vet the flurry of information it receives.
Risk perception and probability
“Skewed risk perception” was demonstrated by the delayed U.S. national adoption of masking in response to COVID-19. Two years into the pandemic, this issue continues to be at play — seen both in populations who advocate against vaccines and in those who favor personal firearms. The Gormans present a detailed discussion of risk perception theory, exploring the roles of social context and behavioral economic principles in our perception of risk. They ask us to integrate into our health care decision-making the difficulty we all have with “linear risk perception.” To this end, they recommend that we should “in a risk-communication setting (such as an epidemic)…have a two-way dialogue until common ground is established.” They do note that this is not our usual means of communicating public health information. And we all recognize how difficult this is in the currently polarized socio-political climate in the United States.
In their prescription for combating denial, the Gormans stress that “giving the facts is a necessary, if insufficient, component of any attempt to debunk myths about health and science.” This is difficult when politicians have usurped the role of scientists, and when scientific research is woefully underfunded. The Gormans’ eponymous method for restoring scientific thinking in decision-making requires acknowledgement of seven guiding principles:
It is not simply uneducated people who make irrational health decisions.
It isn’t about a simple “lack of information.”
Empathy and evolutionary benefits may sometimes be at odds with rational thinking.
Hypothesis testing precludes professing absolute certainty, and people are uncomfortable with this.
People respond more to emotion than to statistics, but charismatic leaders use emotion while scientists use statistics.
People have trouble changing their minds.
People have trouble understanding probability and risk.
Their solutions are, as paraphrased here, that:
Science must deal with increased access to various types of information via the internet.
Science must use evidence-based methods to counteract misinformation.
Members of the media need to be better trained to understand what a valid scientific debate is, and what it is not.
Scientists must be more sensitive to difficulties in communicating causality, peoples’ discomfort with uncertainty, and their own weaknesses.
Better childhood education is needed — in statistics, the scientific method, critical thinking and the assessment of evidence.
Health care professionals need to engage in motivational interviewing with people who have incorrect medical beliefs.
We must all examine our tendency to think uncritically, and to place emotion over reason.
How does this impact our world of medical regulation? We are the vanguard for how the public perceives practitioners. We cannot ignore our duty to educate. We must demonstrate clarity, fairness and the necessity of our standards — to legislators, the public, the media and to each other — as well as to the providers we regulate. In addition, conflicts of interest must be addressed transparently so that stakeholders can see, and develop confidence in, the capacity of medicine to self-regulate. Failing to do so will open the door to having oversight imposed on us, potentially by science-denying “charismatic leaders.” Most importantly, this book offers a clarion call to the medical profession: to engage, beyond the walls of medicine, with the public, with the media and with government. Intentional and informed approaches that incorporate data as well as emotion are required to guide us all through the challenges that we face today in health care.