“You will become way less concerned with what other people think of you when you realize how seldom they do.”
—David Foster Wallace, Infinite Jest
You and I are members of one of the most social species on the planet. No human being succeeds in life alone. Getting along and getting ahead requires coordinating with others, either in cooperation as friends, spouses, teammates, and coworkers, or in competition as adversaries, opponents, or rivals. Arguably our brain’s greatest skill is its ability to think about the minds of others to understand them better. Our daily lives are guided by inferences about what others think, believe, feel, and want. This is your sixth sense at work.
But over the past two decades of my research as a psychologist, my experiments, together with research from many other scientists, have demonstrated that our sixth sense works well, but not nearly as well as we might think. The truth is that you are likely to understand much less about the minds of your family members, friends, neighbors, coworkers, competitors, and fellow citizens than you would guess.
One of the biggest barriers to understanding others is excessive egocentrism. You can’t see into the mind of others because you can’t get over yourself. You can’t overcome your own experiences, beliefs, attitudes, emotions, knowledge, and visual perspective to recognize that others may view the world differently. Copernicus may have removed the Earth from the center of the universe, but every person on this planet is still at the center of his or her own universe.
One consequence of being at the center of your own universe is that it’s easy to overestimate your importance in it, both for better and for worse. Consider a classic psychology experiment that asked married couples to report how much each of them was personally responsible for a variety of household activities. These included relatively desirable tasks, like cleaning the house, making breakfast, and resolving conflicts, but also undesirable actions, like messing up the house, irritating their spouse, and causing arguments. Husbands and wives were separated from each other and then asked to indicate, out of the total amount for each activity, what percentage they were personally responsible for. The researchers then simply added the spouses’ estimates together for each item. Logically, this sum cannot exceed 100 percent. If I claim that I make breakfast 80 percent of the time and my wife claims that she makes breakfast 60 percent of the time, then our kids are apparently eating breakfast 140 percent of the time. Not possible, even for the fattest families. But psychologically, if I can think of the times I made breakfast more easily than the times my wife made breakfast, then by extrapolation, there will be a lot of reportedly overstuffed families out there.
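The researchers' logical check can be sketched in a few lines of code. The percentages below are invented for illustration, not data from the actual study:

```python
# Each spouse independently reports what percentage of an activity they
# feel personally responsible for. Logically, the pair's combined claims
# for any one activity cannot exceed 100 percent.
claims = {
    "making breakfast":   {"husband": 80, "wife": 60},
    "cleaning the house": {"husband": 40, "wife": 75},
    "causing arguments":  {"husband": 55, "wife": 65},
}

for activity, pair in claims.items():
    total = sum(pair.values())          # combined responsibility claimed
    if total > 100:
        print(f"{activity}: combined claim = {total}% "
              f"(overclaimed by {total - 100} points)")
    else:
        print(f"{activity}: combined claim = {total}%")
```

With these made-up numbers, every activity sums to well over 100 percent, which is the signature of egocentric overclaiming the experiment found.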
This is exactly what the results showed. The couples’ estimates, when added together, significantly exceeded 100 percent. Surely you’ve experienced this self-centeredness before. It’s even made its way into jokes. Here’s one: What’s a woman’s definition of barbecue? Answer: You bought the groceries, washed the lettuce, chopped the tomatoes, diced the onions, marinated the meat, and cleaned everything up, but he “made the dinner.”
Don’t get too smug. The really interesting result is that researchers find consistent evidence for overclaiming, albeit to a lesser extent, for negative activities as well. In this experiment, spouses tended to claim more responsibility than is logically possible even for undesirable activities like causing arguments. Being self-centered also means being uniquely aware of your faults and shortcomings, knowing when you needled your spouse after a tough day at work or broke the dishes and scuttled them into the trash can before anyone else noticed. It’s harder to notice your spouse’s bad intentions or unhandy dish work.
This finding can be surprising. When researchers asked married couples to predict how much responsibility their partner would claim for positive and negative activities, the predictions were very cynical, but the actual reports again showed self-centered overclaiming. Spouses tended to assume that their partner would vainly accept credit for all of the desirable activities in their marriage but deflect blame for the undesirable activities. In fact, partners were again egocentric, tending to claim more responsibility than is logically possible for all activities, positive as well as negative. It’s not hard to imagine how both overclaiming and overly cynical assumptions about overclaiming can lead to unwarranted discord in almost any relationship.
As consistent as self-centered overclaiming in marriages may be, it’s likely to be even stronger outside of them, as work groups and teams get larger. In a marriage, overcoming egocentrism requires thinking of only one other person. That’s not too hard. But as a group’s size increases, the number of others you could be overlooking also increases, and so, too, do the consequences of being egocentric.
Consider an experiment Eugene Caruso, Max Bazerman, and I conducted with 699 Harvard MBA students. These students worked together in the same study groups for the entire two years of their MBA program, doing everything involved with their coursework together. The groups generally ranged from four to nine students. We asked some of these MBA students to report what percentage of the group’s total work they personally contributed. We found that the amount of overclaiming increased as the size of the group increased. Groups of four or fewer looked relatively reasonable, claiming more responsibility than is logically possible but at least staying in the vicinity of 100 percent. As the number of group members increased, however, the judgments became increasingly unhinged from reality. By the time you get to groups of eight, these MBAs were claiming nearly 140 percent productivity! This brings new meaning to overachieving.
The important point is to relax a bit when others don’t seem to appreciate you as much as you think they should. The mistake may be a product of egocentrism in your own head rather than others’ indifference.
On Self-Centered Stage
Thinking that others should give you more credit than they actually do is just a small part of our larger egocentric tendencies. At extreme levels, self-centered thinking can lead to paranoia, a belief that others are thinking about you, talking about you, and paying attention to you when they are not. That sounds crazy because it is, and yet all of us are prone to momentary bouts of craziness in the right circumstances. “People are insanely self-conscious,” Elaine Miller, author of the popular interior decorating blog Decorno, told The New York Times, explaining the care socialites take to convey just the right image to others with their interior decorating. “People act like they’re always being watched. Even their house is a performance.”
We’ve all been there at one time or another. Maybe you slipped on an icy sidewalk and were hurt more by the pain of embarrassment than the pain of the fall? Or said something stupid in a meeting and were certain that everyone was ridiculing you afterward in whispers? Or forgot the name of a new but important acquaintance and felt mortally embarrassed? All of the world may indeed be a stage, and it’s easy to feel that we’re at the center of it. Not only can this lead us to overestimate our impact in the world, it can even lead us to overestimate the extent to which others are noticing our very existence.
Consider participating in what I think is one of the most liberating experiments ever conducted—the Barry Manilow experiment. Here, researchers recruited unknowing undergraduates to participate in what they believed was a standard psychology experiment. Imagine you are one of them. When you arrive at the lab, the experimenter leads you down the hallway into a small room, shuts the door behind you, and asks you to “put on this T-shirt as part of the experiment.” You unfurl it before you and there, in all of its glory, is a shirt emblazoned with a large picture of Barry Manilow. You might be a big fan, but most people are not. Even fans might be a bit reluctant to bare their enthusiasm in a full-frontal exposure.
But you go along anyway, put on the T-shirt, and follow the experimenter back down the hallway; she leads you into a room where other participants are already sitting (none, of course, wearing a Manilow shirt). The experimenter explains that you’re running a bit late but you can still participate anyway. So you slink into your chair, at which point the experimenter appears to have a change of heart, apologizes, and says that it’s really too late and that you’ll need to do this experiment another time. She then leads you out of the room.
Here comes the most important part. The experimenter tells you that the experiment is actually now over and asks you to estimate the number of people in the room who would be able to identify the person on your shirt. While you are outside the room, the experimenter asks the other people sitting in the room to identify who was on your shirt. Those wearing the shirt estimated that nearly 50 percent would notice their Manilow shirt when, in fact, only 23 percent actually did. Even in small groups, the social spotlight does not shine on us nearly as brightly as we think.
The point here is that few of us are quite the celebrity that our own experience suggests we might be; nor are we under as much careful scrutiny from others as we might expect. Early on in Casablanca, Peter Lorre learns this lesson the hard way when he looks to Humphrey Bogart for some recognition, saying, “You despise me, don’t you?” Bogart replies, “If I gave you any thought, I probably would.” I think we would all benefit from having our own Casablanca moment.
The Eyes of Experts
Your own beliefs serve as a lens for understanding what others are likely to believe, as well as how strongly they are likely to believe it. But your mind contains multitudes, and beliefs are not the only lens that can alter your perceptions. Knowledge can also do it. For example, read the following sentence:
FINISHED FILES ARE THE RESULT
OF YEARS OF SCIENTIFIC STUDY
COMBINED WITH THE
EXPERIENCE OF YEARS.
Now please go back and count how many f’s appeared in that sentence. This is important. I’ll wait for you.
How many did you find? More than you can count on one hand? If not, then we have just confirmed that you are a terrific reader but a terrible counter. Try it again. Look harder. I’ll be patient.
Found all six yet? Don’t forget that “of” has an f in it.
See them all now? Most people who read this sentence fail to spot all six of the f’s on their first pass. Instead, most see only three. Why so few? This example has nothing to do with your beliefs and everything to do with your knowledge. Your expertise with English blinds you from seeing some of the letters. You know how to read so well that you can hear the sounds of the letters as you read over them. From your expert perspective, every time you see the word “of” you hear a v rather than an f and, therefore, miss it. This is why first graders are more likely than fifth graders to find all six in this task, and why young children are likely to do better on it than you just did. Your expert ears are clouding your vision.
This example illustrates what psychologists refer to as the curse of knowledge, another textbook example of the lens problem. Knowledge is a curse because once you have it, you can’t imagine what it’s like not to possess it. You’ve seen other people cursed many times. For instance, while on vacation, have you ever tried to get driving directions from a local? Or talked to an IT person who can’t explain how to operate your computer without using impenetrable computer science jargon? In one experiment, expert cell phone users predicted it would take a novice, on average, only thirteen minutes to learn how to use a new cell phone. It actually took novices, on average, thirty-two minutes.
The lens of expertise works like a microscope, allowing you to notice subtle details that a novice might not catch but also sharpening your focus in a way that can allow you to miss the bigger picture and make it difficult to understand a novice’s perspective. Trying to correct this lens first requires becoming aware of its influence. The problem is that it’s hard to know when you are being affected by your own expertise and when you are not. Consider what is probably the most famous dissertation experiment in the history of psychology: Elizabeth Newton’s “tapping study.” In this experiment, carried out with pairs of subjects, one volunteer in each pair was randomly assigned to be the “tapper” and the other the listener. Tappers received a list of 25 songs well known to them, including “America the Beautiful” and “Rock Around the Clock.” Tappers were asked to pick out three songs and then tap out each one for the listener, while they sat back-to-back. Tappers then estimated the likelihood that listeners would identify each tune correctly, and listeners tried to identify each one. The results were striking. Tappers estimated that listeners would identify the song correctly, on average, 50 percent of the time. In fact, listeners guessed correctly only 2.5 percent of the time.
It’s now easy to understand the gap between tappers and listeners. The tappers were relative experts, being very familiar with the song they were tapping out and hearing it in their own minds while doing so. The listeners, however, were privy to none of this orchestration; instead, they heard only the equivalent of musical Morse code. The important point is that the tappers were simply unable to appreciate how the rich music in their mind’s stereo would sound to a listener hearing it through their finger’s speakers.
None of us communicate by tapping alone, but the lens problem affects anyone who has unique knowledge of anything: the boss who understands a proposal inside out and is trying to convey the ideas to new clients, the inventor who knows precisely why her invention is so important speaking to impatient venture capitalists, or the coworker who is “just teasing” a new hire who knows nothing of the teaser’s friendly intentions. The expert’s problem is assuming that what’s so clear in his or her own mind is more obvious to others than it actually is.
Blankish States, E-Mail, and God
The problem of expertise is one of many examples of mistakes that come from projecting our own minds onto others: assuming that others know, think, believe, or feel as we do ourselves. Of course, we do not project ourselves onto others completely. We do so in some situations more than in others, and we project more onto some minds than others. The less we know about the mind of another, the more we use our own to fill in the blanks. Conservatives and liberals don’t know what the “average” person thinks, or how people who didn’t vote would have voted, and so they rely more on what they think themselves. Ask conservatives and liberals what their neighbor thinks, what their parent thinks, or what their spouse thinks and you are likely to see much less egocentrism. The lens problem therefore becomes larger as other minds become more unknown.
Understanding this allows you to explain, simultaneously, the problem with e-mail and the problem with God.
Let’s tackle the big one first: e-mail. Much of what we communicate to others depends not only on what we say but on how we say it. The same comment about one’s “nice hair,” “great question,” or “brilliant idea” can be taken as a compliment or an insult, depending on the tone of your voice or the smirk on your face. None of this subtlety makes it into your in-box. Although it’s not as bad as tapping out a novel in Morse code, text-based mediums like e-mail and Twitter nevertheless communicate the content of what is said but little of the subtle context of how it is said, making them considerably more ambiguous and open to egocentric influences than face-to-face communication.
Consider an experiment that highlights the lens problem in such ambiguous communication. My collaborators and I asked one group of volunteers to write two different sentences about 10 topics, one intended to be serious and one intended to be sarcastic. The topics were whatever came to Justin Kruger (my collaborator) and me while brainstorming in Justin’s office—things like food, cars, California, dating, and movies. We then asked each of our volunteers to convey these messages to another person in the experiment. In one condition, they sent the message via e-mail; in the other, they spoke it over the telephone. Our senders predicted that they could communicate just as well via e-mail as they could over the phone (roughly 80 percent accuracy in both cases). Those actually receiving the messages, however, could understand the speaker’s intention only when the communication was clear (that is, when the speaker was on the phone). With e-mail, the receivers were no more accurate than you’d expect from a coin flip.
The problem for our volunteers was that they knew whether their message was meant to be sincere or sarcastic. So when they said, “Blues Brothers 2000—now, there’s a sequel,” they could hear the sarcasm dripping from their voice regardless of whether they were actually using their voice or typing with their fingers. Those receiving the message, of course, could hear the sarcasm only through the speaker’s voice and heard nothing from the speaker’s fingers.
The ambiguity of the text was lost not only on the senders but on the receivers as well. At the end of the experiment, we asked the receivers to guess how many of the items they had interpreted correctly. They thought they had done a superb job, interpreting nine out of ten of the sentences correctly, regardless of whether the communication had been over the phone or by e-mail. Here you can see why ambiguous mediums like e-mail and texting and Twitter are such fertile ground for misunderstanding. People using ambiguous mediums think they are communicating clearly because they know what they mean to say, receivers are unable to get this meaning accurately but are certain that they have interpreted the message accurately, and both are amazed that the other side can be so stupid.
As the context in which you’re trying to understand another mind becomes more ambiguous, the influence of your own perspective increases. If you really want to understand your coworker or competitor or children, don’t rely on modern mediums of communication that give you only a modern Rorschach test about the mind of another person. Twitter does not allow others to understand your deep thoughts and broad perspective. It only allows others to confirm how stupid they already think you are.
Now to the other problem: God. Just as the medium through which you communicate can be more or less ambiguous, so, too, can the target you’re reasoning about. You do not need to rely on your own beliefs to know that Barack Obama is liberal and George W. Bush is conservative. Both express their beliefs loud and clear, they are identified with liberal and conservative parties, and other people can tell you that they are, respectively, liberal and conservative. Their beliefs are relatively obvious. So, too, with your spouse, your friends, your kids, and your neighbors, who can respond to questions you ask them. Even the general public can answer opinion polls. But the less willing or able others are to give you a piece of their minds, the more their minds become a blank slate onto which you project your own.
Enter God. Believers consult few figures more often than God when it comes to weighty matters, from moral issues, such as gay marriage, abortion, and martyrdom, to personal issues, such as career planning or dating choices. The problem is that God doesn’t answer opinion polls, and the books that supposedly report God’s beliefs are notoriously open to interpretation. Many of the world’s wars are still fought over what God apparently does or does not want, fueled by the sense of having God on one’s own side. “Both read the same Bible and pray to the same God,” Lincoln noted during his second inaugural address, at the height of the Civil War, “and each invokes His aid against the other. It may seem strange that any men should dare to ask a just God’s assistance in wringing their bread from the sweat of other men’s faces, but let us judge not, lest we be not judged.” Sadly, few people have Lincoln’s gift of self-reflection. Did Jesus believe that small government or big government would more effectively help the poor? Does religion condemn gay marriage or condone it? Does God want you to get that mortgage? Does God want you to be rich? Lloyd Blankfein, the CEO of Goldman Sachs, once told a reporter that he was just a banker “doing God’s work.” I believe that this assessment is open to debate.
Like any belief, these beliefs about God surely come from multiple sources. On some issues, the religious position is clear and one’s own beliefs are unlikely to play any role. Religion creates beliefs, after all. But where there is wiggle room, God may become another example of the lens problem, created in one’s own image. Many have suggested this hypothesis. Xenophanes, a sixth century B.C. Greek philosopher, was the first person to describe anthropomorphism; he noted how Greek gods invariably had flowing hair and fair skin, whereas African gods had curly hair and dark skin. As Darwin put it, humans interpreting religious experiences “would naturally attribute to spirits the same passions, the same love of vengeance, or simplest form of justice, and the same affections which they themselves feel.” Bob Dylan even put the sentiment to music, in the song “With God on Our Side.” But understanding the lens problem suggests an even more specific prediction than these generalities. If religious agents are more ambiguous than other people, then believers might be even more egocentric when reasoning about God’s beliefs than when reasoning about other people’s beliefs. Other people agree with me, but God is really on my side.
Several pieces of evidence support this possibility, at least within monotheistic conceptions of God. In surveys, my collaborators and I have consistently found a stronger correlation between people’s own beliefs and their predictions of God’s beliefs than with their predictions about other people’s beliefs. From attitudes on abortion to support for same-sex marriage or the death penalty, Judeo-Christian believers’ own attitudes match what they think God believes much more closely than what they think other people believe. These are consistent results, but they’re nothing more than correlations. The opposite causal direction is also completely plausible: people come to believe what they think their God believes.
More compelling evidence comes from a neuroimaging experiment. We asked volunteers to report their own beliefs, God’s beliefs, and the average American’s beliefs on a wide variety of social issues while they were lying on their backs in an fMRI scanner. We found some clear distinctions. Major differences in neural activity emerged when people reasoned about their own beliefs and the average American’s beliefs. We found the very same pattern of differences when people reasoned about God’s beliefs versus the average American’s beliefs. But the most amazing result of all was that we could not tell the difference in overall neural activity between people reasoning about their own beliefs versus God’s beliefs. In the scanner, reasoning about God’s beliefs looked the same as reasoning about one’s own beliefs.
The most compelling evidence, however, comes from experiments in which we manipulated people’s own beliefs and measured how it affected what people think God and others believe. In one, we showed volunteers persuasive arguments either in favor of or opposed to affirmative action. The arguments worked: those who read the pro–affirmative action information became more in favor, whereas those who read the anti–affirmative action arguments became more opposed. More important, our manipulation moved our volunteers’ estimates of God’s beliefs in lockstep with their own, whereas estimates of other people’s beliefs were unaffected by the arguments the volunteers read. Creating God in one’s own image, indeed.
If God is a moral compass, then the compass seems prone to pointing believers in whatever direction they are already facing. There’s nothing magical about God in this regard, just something ambiguous. When legislators speak of the Founding Fathers’ intentions while interpreting the Constitution or politicians talk about what “the people” want, you are likely witnessing an act of divination that tells you more about the speaker’s own beliefs than their target’s beliefs. The injunction here is not for more cynicism when listening to others but, rather, for more humility when it seems that other people, gods, founding fathers, or legal teams are, in fact, on your side. When others’ minds are unknown, the mind you imagine is based heavily on your own.
Nicholas Epley is the John Templeton Keller Professor of Behavioral Science at the University of Chicago Booth School of Business. He was awarded the 2011 Distinguished Scientific Award for Early Career Contribution to Psychology from the American Psychological Association. His book, Mindwise: How We Understand What Others Think, Believe, Feel, and Want, won the 2015 Book Prize from the Society for Personality and Social Psychology.
From the Book:
Mindwise: Why We Misunderstand What Others Think, Believe, Feel, and Want by Nicholas Epley
Copyright © 2014 by Nicholas Epley
Published by arrangement with Alfred A. Knopf,
an imprint of The Knopf Doubleday Publishing Group, a division of Penguin Random House LLC