Facts So Romantic

How Uncertainty Can Help Fight Science Denialism

Forked path (Image: Shutterstock)

Why does a statement like “vaccines cause autism” persuade some people and not others? Each side of the issue will no doubt claim some support, but if we know anything about psychology, it’s that facts don’t always settle an argument. Those who claim a link between vaccines and autism, without any evidence to support that claim, are just as certain as those who discredit it.

Communication researchers have been tackling this question for decades. How can two people look at the same information and come to radically different conclusions? How can a single retracted study by a disgraced and dismissed doctor be garbage to some and gospel to others? Three decades ago, scientists came up with one model to try to explain it. They split our cognition in two.

In the early 1980s, Shelly Chaiken spearheaded the development of a communication model that quickly began to dominate the field, not just because it considered the relationship between medium and receiver differently (it did), but because it looked at how people really think. And how we think is rather simple, according to the model. We are all cognitive Scrooges. You might even call it the law of conservation of mental effort. Like physical effort, you can expend a lot or a little, and how much you spend determines what outcome you get. But it’s always easier to expend less. Chaiken’s model used this mental economy to sort our thinking into two styles: heuristic and systematic. Heuristic thinking, which relies on quick-and-dirty rules and cues, expends little effort. Information is processed according to superficial characteristics, like the academic degrees of the author. But this is far from a “lazy” approach to thought. Without heuristics, you would be lost in thought just deciding how much cream to put in your coffee. Heuristic processing is efficient processing, even if it can lead to superficiality.


Systematic thought, on the other hand, is an in-depth look at the evidence. An article isn’t taken as fact simply because it was written by a doctor. The references are checked; the arguments are evaluated. But there is a trade-off to this Sherlock-like approach. It’s hard to find the resources in our cognitive economy to think deeply about everything all the time. In reality, we constantly switch between heuristic and systematic processing depending on the task, and both styles can occur simultaneously. (A similar dual-process model of human thought has also been widely popularized as “system 1” and “system 2” thinking by Nobel laureate Daniel Kahneman.)

We all have the ability to think in both styles, but what pushes us into one or the other differs from person to person and situation to situation. Chaiken’s heuristic-systematic model says that uncertainty determines our thinking style in any one situation. Uncertainty, measured as the gap between what you know and what you think you need to know to make a confident judgment, drives processing. The bigger the gap, the harder you have to work to close it, and the more you’re pushed toward systematic processing. If you think you already know everything you need to, why spend much effort evaluating a message? A small gap predicts heuristic processing.
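To make that mechanism concrete, here is a minimal toy sketch in Python of the gap logic, assuming confidence can be scored on a 0-to-1 scale; the function name, the numbers, and the threshold are my own illustration, not part of Chaiken’s model.

    # A toy sketch of the uncertainty-gap idea (an illustration, not the
    # model itself): compare what you know to what you think you need to
    # know, and let the size of the gap pick the processing style.

    def processing_style(actual_confidence, desired_confidence, gap_threshold=0.1):
        """Predict heuristic vs. systematic processing from the uncertainty gap.

        All values are hypothetical 0-to-1 scores; the threshold is an
        arbitrary stand-in for "a gap big enough to feel."
        """
        gap = desired_confidence - actual_confidence
        if gap > gap_threshold:
            # A large gap demands effort: check references, weigh arguments.
            return "systematic"
        # A small (or negative) gap: cheap cues feel sufficient.
        return "heuristic"

    # Someone certain they already know vaccines are harmful: tiny gap.
    print(processing_style(actual_confidence=0.9, desired_confidence=0.9))  # heuristic
    # Someone convinced that accuracy really matters here: big gap.
    print(processing_style(actual_confidence=0.4, desired_confidence=0.9))  # systematic

On this picture, the persuader’s only real lever is the listener’s sense of how much confidence the judgment deserves, which is exactly the lever the paragraphs below pull.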

Thinking about our thinking in this way begins to sort out why a pro-vaccine message can be so easily ignored (or an anti-vaccine message so persuasive). A person who is anti-vaccine might believe that he already knows all he needs to about vaccines (e.g., that they are harmful). Why would we expect someone to then deeply process a scientific study or message saying the opposite? A superficial scan reveals the pro-vaccine leanings, and the rest is ignored, discredited, or misconstrued.

There’s a solution hidden in the model, too. To encourage more in-depth evaluation, the communicator should tiptoe around grand emotional appeals and instead emphasize the importance of accuracy in judging the question at hand. If listeners believe there is a large gap between their current and required knowledge on a topic, they are more likely to exert the effort needed to close it. No psychologist would guarantee that a person thinking systematically will reach a scientifically correct conclusion, but the train must first be on the track.

We might not want to admit that things like attractive graphics or an author’s authority easily persuade us, but they do. We need convenient mental bridges to span the information deluge of everyday life. And it can be frustrating to admit that even with all the processing power in the world, you can still come to the wrong conclusion. We bias, we discredit, we blindly endorse, we conflate, and we often simply don’t know enough. This happens no matter which cognitive style you engage.

Ultimately, persuasion isn’t an outcome; it’s a process. Anti-vaccine views don’t just appear; they’re driven there by the same uncertainty and mental economy that influence what news anchor you trust or what kind of car you buy. To get more people to believe in good science, we have to better use the science of persuasion. 


Kyle Hill is a freelance science writer who manages the Scientific American blog Overthinking It, and contributes to Slate, Wired, Popular Science, Skeptical Inquirer, and io9. He is a research fellow with the James Randi Educational Foundation, and you can follow him on Twitter under @Sci_Phile.

