I can remember, almost a decade ago, when I was convinced out of my “Young-Earth” Creationism. It was almost a process of de-radicalization. During high school I was a generic Christian, but then some friends suggested I watch a video of a pastor online and, well, you can guess the rest. The message encouraged spreading skepticism of evolutionary biology, geology, and cosmology—the fields which most directly contradicted the Christian fundamentalist worldview. It was this evangelism, though, that ultimately helped undermine that belief. It later led me, in my freshman year of college at U.C. Santa Barbara, to enroll in an intro to geology course—I thought I’d see through the arguments in favor of an Old Earth and maybe gain some converts. But my idea that Earth is 6,000 years old didn’t survive scrutiny. With reason and evidence, my professor changed my mind.
It’s not easy changing someone’s mind, especially when what you’re trying to change is a settled opinion. Only rarely does persuasion succeed in replacing one belief with its opposite, even among scientists. As the late philosopher of biology David L. Hull once wrote, “The objectivity that matters so much in science is not primarily a characteristic of individual scientists but of scientific communities. Scientists rarely refute their own pet hypotheses, especially after they have appeared in print, but that is all right. Their fellow scientists will be happy to expose these hypotheses to severe testing.”
When you’re persuaded, though, it can be memorable. The feeling of having your view change when you didn’t want it to, or weren’t expecting it to, is, at first, a little disorienting, like putting on a new pair of strong prescription glasses. But you quickly find that you appreciate the resulting clarity. Of course, it’s possible to be persuaded by a falsehood and yet feel in the right. In his novel Slouching Towards Kalamazoo, Peter de Vries pits the town atheist against the town priest in a debate. So persuasive was the atheist to the priest, and the priest to the atheist, that each adopted the other’s position and found himself unable to change his mind back again.
If this amusing scene parodies anything, it’s intellectual humility, or intellectual honesty. From it, you could take away the lesson that it’s possible to be too open-minded or too lacking in conviction in one moment, and then too stubborn or narrow-minded in the next. Still, in these times—post-truth, post-fact, whatever you want to call this uber-polarized, choose-your-own-reality moment—maybe we could do with more intellectual humility, and a better understanding of it. “Not being afraid of being wrong—that’s a value, and I think it is a value we could promote,” Mark Leary, a Duke University psychologist and neuroscientist, said. “If you think about what’s been wrong in Washington for a long time, it’s a whole lot of people who are very intellectually arrogant about the positions they have, on both sides of the aisle.”
Leary and several colleagues are the authors of a new paper titled “Cognitive and Interpersonal Features of Intellectual Humility.” The trait, they found, is “associated with variables related to openness, curiosity, tolerance of ambiguity, and low dogmatism.” Naturally, people high in intellectual humility expressed less certainty in their beliefs about religion and gave less weight to religious opinions in judging others; they were also less likely to deride politicians who change their positions as “flip-floppers.” What’s more, the researchers showed that people high in intellectual humility were “more attuned to the strength of persuasive arguments than those who were low.” So intellectually humble people aren’t humble because they’re bad at evaluating arguments; if anything, they evaluate them better.
The findings of Leary and his colleagues line up with prior philosophical analyses of the trait. In a 2015 paper, a group of philosophers described intellectual humility as “proper attentiveness to, and owning, one’s intellectual limitations.” Would you say you’re intellectually humble? You may be in a minority. To the philosopher and neuroscientist Sam Harris, that sort of person seems hard to come by. In response to Edge’s 2017 question, “What scientific term or concept ought to be more widely known?” he offered “intellectual honesty.” He thinks many of us find it hard to attend to, and own, our intellectual limitations. But “to the extent that we can learn it, we acquire a superpower of sorts,” he wrote. “In fact, a person who surrenders immediately when shown to be in error will appear not to have lost the argument at all. Rather, he will merely afford others the pleasure of having educated him on certain points.”
One way to cultivate more intellectual humility is to acknowledge how hard it is to really know something—especially when it comes to other people. When we asked the social psychologist Nicholas Epley whether there had been, during his student years, a seminal book in his life, he said, “Unquestionably. It’s Thomas Gilovich’s book, How We Know What Isn’t So. Tom’s interest was in understanding the difference between perception and reality, usually at the level of the individual judgment. How we might make a decision that’s mistaken or misguided in some way; how we might have a belief about the way the world works and it’s just not quite right…There’s nothing that we spend more time thinking about in our daily lives than other people. Other people are the most complicated things we ever think about.”
Brian Gallagher is the editor of Facts So Romantic, the Nautilus blog. Follow him on Twitter @bsgallagher.