
One question for Daniel Williams, a philosopher at the University of Cambridge who draws on recent advances in psychology to understand how various forms of irrationality and bias are socially adaptive.

Image courtesy of Cambridge University

What is misinformation doing to us?

Since 2016, there’s been a growing panic about how much misinformation is affecting what people believe and do. But what I argue in a recent paper, published in Economics & Philosophy, is that the panic around misinformation is misguided. The share of misinformation in most people’s information diet, in countries like the United States and the United Kingdom and in Northern and Western Europe, is pretty negligible. (There’s much less research when it comes to other countries around the world.) The minority of the population that consumes a lot of misinformation tends to be made up of people who are already extremely partisan or dogmatic on certain issues anyway, which suggests that the misinformation is not really changing their behavior.


Before there’s any kind of policy-level decision making when it comes to things like censorship and banning, it’s really crucial to get that causal arrow correct: Is it the case that people are being misinformed by the misinformation, or are they seeking out evidence and arguments to rationalize beliefs they already hold? People tend to be pretty vigilant when it comes to acquiring information from different sources; if anything, they rely more heavily on their own intuitions than on information acquired from other people.

In his great book Not Born Yesterday, Hugo Mercier goes through an enormous amount of evidence showing how sophisticated people are when it comes to evaluating information that they encounter. The first thing people do, not always consciously, is a kind of plausibility checking. They also ask, “Can I hold this person accountable if they misinform me? Have they got good arguments? Do I have good reason to believe that they’re a trustworthy source?” People use all of these different cues to weigh up the reliability of information. But the catch is that they’re only vigilant in that way when their aim is to acquire accurate beliefs.

When, on the other hand, they’re engaged in motivated reasoning, when they’re motivated to form beliefs because those beliefs are favored by their in-group, say, then people tend to be much more receptive to information if it confirms and rationalizes their favorite narrative. This is where the idea of a marketplace of rationalizations comes in. This is any kind of social system in which certain individuals or firms stand to benefit, either financially or socially, from producing and disseminating information not to inform people but to rationalize what people are motivated to believe. There’s a widespread demand for rationalizations of the narratives of different political, cultural, and social groups. Certain ambitious media companies or people on social media stand to benefit from churning out intellectual ammunition that justifies these favored narratives.

Insofar as people are being “gullible,” it’s almost a strategic gullibility, whereby they’re letting their guard down and accepting information because it supports and rationalizes what they’re motivated to believe. It’s not that people are pursuing the truth and then being duped by propaganda and demagogues, although again, I don’t want to say that never happens. For the most part, it’s that people have motivations to view the world in a particular way. And they’re highly receptive to, and indeed seek out, evidence and arguments that support that preferred way of seeing the world.


Membership in crosscutting communities, and reducing your antipathy toward those who belong to different communities, will really weaken the motivation to engage in what the social scientist Dan Kahan called identity-protective cognition, where, roughly speaking, you prioritize your attachment to a particular social group over forming accurate beliefs. I don’t know how optimistic we should be that people will actually pursue that strategy, because especially in a country like the U.S., there is such intense political and cultural polarization. And one of the things that’s difficult about motivated reasoning is that it never feels like you’re engaged in it when you are.

Lead image: Pavlo Plakhotia / Shutterstock
