The world brims with wonders and puzzles and pressing concerns. Scientists of all stripes seek these out, following their curiosity where it leads. The people who read about their discoveries increase their knowledge—about sea-level rise, the consequences of a Western diet, how the coronavirus stays one step ahead of us, you name it—and society, to some extent, hopefully benefits.
The trouble is, popularizing science has tradeoffs. New research published in Nature Human Behaviour suggests that a little knowledge about science can be a dangerous thing, leading to overconfidence in one’s understanding and under-confidence in the problem-solving powers of science. It supports prior research indicating that popular science media “inclines laypeople to underrate their dependence on experts.”
Joana Gonçalves de Sá, a Portuguese computational social scientist at Nova University Lisbon, led the research, which was based on an analysis of four large surveys of general scientific knowledge in the United States and Europe spanning 30 years. She and her colleagues found that the people most likely to engage with popular science have a fair amount of scientific knowledge—but are also the most overconfident. In other words, they’re ironically the least aware of how limited their knowledge is.
Notably, this is contrary to the well-known Dunning-Kruger effect—the finding that the least competent people, in a range of domains, tend to overestimate their abilities the most. Gonçalves de Sá and her coauthors found instead that it’s those with an intermediate amount of scientific knowledge who most overrate what they know. What’s more, having an intermediate level of knowledge was associated with more negative attitudes toward science, like endorsing the idea that “scientific and technological research cannot play an important role in protecting the environment and repairing it.”
Gonçalves de Sá tells Nautilus that the new findings also suggest that our confidence grows in weird non-linear ways, at least when it comes to what we think we know about science.
I would have guessed that the more you know about science, the better you are at judging how much you don’t know. But you found that that’s wrong. How’d you do that?
We looked at about 90,000 science knowledge questionnaires done in Europe and the United States over 30 years. We came up with this metric that did not rely on people’s subjective assessments of their own knowledge. The idea was, if I asked you any scientific knowledge question, and you know the answer, you should give me the correct answer. If you do not know, you should say, “Don’t know.” By using the ratio of incorrect to correct answers on these questionnaires, we could have an idea of how overconfident people are: The more people entered incorrectly, the more overconfident they were. People who have perfect metacognition, meaning they are 0 percent overconfident, will either answer correctly or with “Don’t know.” We saw that people on the very low knowledge levels and people on the very high knowledge levels were more likely to enter “Don’t know.” And people in the intermediate knowledge levels are more likely to answer incorrectly.
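The metric described above can be sketched in a few lines of code. This is an illustrative reading of the interview's description, not the authors' actual analysis code: overconfidence is taken here as the share of attempted answers (i.e., excluding "Don't know" responses) that were incorrect, so a respondent with perfect metacognition scores 0.

```python
def overconfidence(answers):
    """Share of attempted answers that were wrong.

    answers: list of strings, each 'correct', 'incorrect', or 'dont_know'.
    A respondent with perfect metacognition either answers correctly or
    says "Don't know", so they never enter an incorrect answer and score 0.
    """
    incorrect = answers.count("incorrect")
    attempted = incorrect + answers.count("correct")
    if attempted == 0:  # answered "Don't know" to every question
        return 0.0
    return incorrect / attempted

# A well-calibrated respondent: admits uncertainty instead of guessing.
calibrated = ["correct", "correct", "dont_know", "dont_know"]
# An overconfident respondent: enters answers even when unsure.
guesser = ["correct", "incorrect", "incorrect", "correct"]

print(overconfidence(calibrated))  # 0.0
print(overconfidence(guesser))     # 0.5
```

On this measure, knowledge (the number of correct answers) and overconfidence (the rate of incorrect answers among attempts) can be scored independently for every respondent, without asking anyone to self-assess—which is the point the interviewee emphasizes.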
That goes against the logic of the Dunning-Kruger effect.
Right. This idea of the Dunning-Kruger effect, of being unknowledgeable and unaware—we don’t see it. The ones who have some knowledge are the ones who become unaware, really fast, of how limited their knowledge is. It’s as if knowledge grows slowly and linearly, and then confidence grows exponentially: You get a little bit of knowledge and a lot of confidence, and then a little bit of knowledge and even more confidence. So there is a big gap at these intermediate knowledge levels that we don’t see at the beginning levels.
Why does our confidence about what we know grow faster than our actual knowledge?
Ha! I don’t know. If we were 100 percent rational, like some robot, we would adjust confidence to the knowledge level, and there would be no overconfidence.
What sorts of people in the intermediate knowledge group are the most overconfident? Men?
Correct. Men are more likely to never say “Don’t know.”
What does all of this say about science journalism and other efforts to communicate scientific discoveries?
We are explicitly told by our science communication officers to remove most of the complexity, so that we make the scientific discourse easy to apprehend, as well as interesting and engaging.
So one hypothesis we discussed in the paper is that by doing this—by lowering the level of complexity to increase how much people understand about what scientists do—we are creating a society that is not very knowledgeable, but very confident. It’s easy to think, “Oh, OK, I got it. This wasn’t so difficult after all,” and then not understand all of the intricacies, all of the hypothesis testing, all of the things that didn’t go right.
They make the jump into thinking, “I understand as much as the researchers.”