In a classic experiment published in 1959, students spent an hour doing repetitive, monotonous tasks, such as rotating square pegs a quarter turn, again and again. Then the experimenters asked the students to persuade someone else that this mind-numbing experience was in fact interesting. Some students got $1 ($9 today) to tell this fib, while others got $20 ($176 today). In a survey at the end of the experiment, those paid only a trivial fee were more likely to describe the boring activity as engaging. They seemed to have persuaded themselves of their own lie.

According to the researchers, psychologists Merrill Carlsmith and Leon Festinger, this attitude shift was caused by “cognitive dissonance,” the discomfort we feel when we try to hold two contradictory ideas or beliefs at the same time. When faced with two opposing realities (“This is boring” and “I told someone it was interesting”), the well-paid students could externally justify their behavior (“I was paid to say that”). The poorly paid students, on the other hand, had to create an internal justification (“I must have said it was interesting for some good reason. Maybe I actually liked it”).

Scientists have uncovered more than 50 biases that, like this one, can mess with our thinking. For instance, there’s the “availability heuristic,” which makes us think something that’s easy to recall (because it’s emotional or because we’ve experienced it many times) is more common or probable than it really is. (Despite what you might think from watching CSI: Crime Scene Investigation, the world isn’t full of serial killers.) There’s also the “distinction bias,” which makes two options seem more different when considered simultaneously; the “denomination effect,” which makes us more likely to spend money when it’s in small bills or coins; and the “Dunning-Kruger effect,” which makes experts underestimate their abilities and laypeople overestimate theirs.

Such biases can still affect you even if you know all about them, because they operate unconsciously. We judge whether we have a bias by examining our thoughts, and because we believe our thoughts are rational, we often conclude we’re not biased when we are. Psychologists call this contradiction the “bias blind spot.” Although we’re quick to see biases in others, we have more trouble noticing them in ourselves.

And the more we convince ourselves that we don’t have certain biases, the more likely we are to exhibit them. If we believe we’re good people, for example, we may stop trying to be better, which makes us more likely to behave badly. Similarly, if we think we’re smart, we might skip studying for a test and give ignorant answers. In general, if we believe we’re unbiased, we’re giving ourselves permission to be biased.

The effect is best illustrated by the relationship between safety and risk: The safer we feel, the more risk we are willing to take. Imagine, for instance, that you’re walking on ice, carrying a bowl full of hot soup. You’d probably walk much slower than you would on a dry sidewalk with nothing in your hands. This behavior, known as risk compensation, also applies to health. When psychologist Wen-Bin Chiou gave volunteers sugar pills, he found that people who thought the pills were dietary supplements were less inclined to exercise than those who knew they were placebos. A review of road safety studies similarly showed that mandatory seatbelt laws can lead to increased fatalities for pedestrians, motorcyclists, and cyclists, suggesting that drivers may be more reckless when they feel protected.

The “moral credential effect” describes this compensation in the context of moral reasoning. When study subjects were given an opportunity to disagree with sexist statements, for example, they were then more likely to favor giving a stereotypically male job to a man instead of a woman (compared to people who weren’t exposed to the statements). Likewise, people who believed they were morally good were more likely to cheat on a math test. And people who wrote self-congratulatory essays chose to donate only one-fifth as much money to charity as people who wrote self-critical essays.

As much as we may want to believe that thinking positively about ourselves and our lives will lead to positive outcomes, the opposite might be true: When we place too much trust in the goodness of our nature, we fail to notice when we’re bad.

Jim Davies is an associate professor at the Institute of Cognitive Science at Carleton University in Ottawa, where he is director of the Science of Imagination Laboratory.
