Seeing the striking magenta of bougainvillea. Tasting a rich morning latte. Feeling the sharp pain of a needle prick going into your arm. These subjective experiences are the stuff of the mind. Yet what is “doing the experiencing,” the 3-pound chunk of meat in our heads, is a tangible object that runs on electrochemical signals—physics, essentially. How do the two—our mental experiences and physical brains—interact?

The puzzle of consciousness seems to be giving science a run for its money. The problem, to be clear, isn’t merely to pinpoint “where it all happens” in the brain (although this, too, is far from trivial). The real mystery is how to bridge the gap between the mental, first-person stuff of consciousness and the physical lump of matter inside the cranium.

Some think the gap is unbridgeable. The philosopher David Chalmers, for instance, has argued that consciousness is something special and distinct from the physical world. If so, it may never be possible to explain consciousness in terms of physical brain processes. No matter how deeply scientists understand the brain, for Chalmers, this would never explain how our neurons produce consciousness. Why should a hunk of flesh, teeming with chemical signals and electrical charges, experience a point of view? There seems to be no conceivable reason why meaty matter would have this light of subjectivity “on the inside.” Consciousness, then, is indeed a “hard problem,” as Chalmers has labeled it.

The possibility that consciousness itself isn’t anything physical raises burning questions about whether, for example, an AI can fall in love with its programmer. And since consciousness is a natural phenomenon, much like gravity or genes, these questions carry huge implications. Science explains the natural world by physical principles only. So if it turns out that one natural phenomenon transcends the laws of physics, then it is not only the science of consciousness that is in trouble—our entire understanding of the natural world would require serious revision.

But before we run off too fast and far, let us pause, take a breath, and reconsider. The mind is hard to explain in physical terms—this much is obvious. Why it is hard, however, is far from evident. In fact, this question is open to two competing explanations. The first explanation puts the blame on what consciousness is: that it is not physical, as Chalmers claims. Alternatively, this impression (that consciousness isn’t physical) could arise from within, as a result of human bias—a psychological delusion.

Psychological biases are relevant because philosophers and scientists heavily rely on their intuitions as they try to explain what consciousness is. Is it special—can it reveal more about the world than what I can infer from my reason alone? Does my conscious experience seem physical? Can a person lack conscious experience, even if their body looks and works just like mine?

In these mini-thought experiments, intuitions are data. Since consciousness generates intuitions, intuitions can speak to what consciousness is, at least in principle. The problem is that the “psychology of psychology” is a tricky business.

Human cognition, as we know, is laced with biases—we can readily recognize these distortions in visual illusions, auditory hallucinations, and logical fallacies. So, if our perception of the external world is distorted, why assume our internal perception is truthful? In fact, people are demonstrably plagued by multiple biases that cloud their reasoning about how their own psyche works.

Intuitive dualism is one such psychological bias. It suggests to us that the mind is ethereal, distinct from the body. My research suggests that intuitive dualism arises in humans naturally and spontaneously—it emerges from two innate systems: one guides our understanding of the physical properties of objects; the other helps us “read” the minds of others. So it is not the product of rationally analyzing what exists. It is a psychological delusion that arises from within the human mind itself.

Moreover, intuitive dualism has been shown to give rise to various prejudices, ranging from the denial of human nature to our misguided fascination with neuroscience and the tendency to stigmatize people with psychiatric disorders. Our consciousness intuitions could arise from the same source as these biases.

So, if consciousness appears somehow distinct from physical reality, then this conclusion could well arise not from what consciousness really is, but rather from what our psyche is telling us, courtesy of intuitive dualism.

How can we tell, then, whether or not our intuitions (that consciousness is not physical) reflect what consciousness really is? It looks like we are at a stalemate. And that’s a problem for the “hard problem.”

Recent research helps us move forward. To see how, suppose you inspect yourself in the mirror: Your face has an unhealthy greenish hue. Before rushing to the ER, you step outside and reexamine your image in natural light. The greenish appearance changes, and you are in the clear; the strange color was likely an artifact of the lighting. Shifting intuitions are diagnostic—they can help us identify our own biases.

The same logic can help us sift consciousness fact from fiction. If our intuitions about consciousness faithfully reflect what consciousness is, and if the nature of consciousness is invariant—meaning it doesn’t change from being physical to non-physical—then our intuitions about the nature of consciousness should also not change. In other words, they should not vary by context. But if consciousness intuitions shift, such that, in some situations, consciousness seems ethereal, and in others, it seems physical, then it’s likely something about our psyche that explains these intuitions, not consciousness itself.

When such shifts are detected, it’s a telltale sign of a psychological delusion, just like the shifts of your greenish complexion above. And if we could further explain how these shifts arise from within, by spelling out their psychological causes, then our confidence in this psychological explanation would only grow. Such shifts in consciousness intuitions, then, suggest that our intuitions can’t be trusted to reveal reliable information about the nature of our minds. This is precisely what psychological experiments probing our intuitions about consciousness show.

Sometimes this involves asking people to consider the idea of a philosophical zombie. Philosophical zombies are hypothetical creatures that lack consciousness, yet everything else physical about them—including their biology and brain chemistry—matches humans perfectly. The idea is that if someone believes consciousness is distinct from physical matter, then that person should think philosophical zombies could, in theory, exist. And if philosophical zombies are conceivable, then perhaps the possession of an intact human body does not guarantee consciousness. Consciousness, so the argument goes, is thus distinct from the physical.

When philosophers are surveyed, most respond that, indeed, zombies are conceivable. But what about most people—do they, too, share these intuitions?

In a 2021 paper published in the journal Cognition, researchers Eugen Fischer and Justin Sytsma examined whether the idea of a “philosophical zombie” makes sense to ordinary people. To find out, they asked participants to rate whether philosophical zombies would be capable of having conscious experiences, feelings, and emotions. Responses hovered around the “neutral” midpoint of the seven-point rating scale (4, “neither agree nor disagree”). Few participants, to be sure, outright denied that zombies are conscious, and fewer still denied that zombies are conscious while affirming that they have a functional, humanlike body (as the instructions to the experiment suggested). This last result could indicate either that most people cannot fully conceive of philosophical zombies or that participants are simply reluctant to respond “no” (e.g., No, zombies aren’t conscious!).

Indeed, in a 2022 study published in the Proceedings of the National Academy of Sciences, my colleagues and I showed that when forced to give a binary response (instead of a seven-point rating scale that allows them to “sit on the fence”), participants state that a perfect replica of a person’s body will not maintain their mental states—thoughts and beliefs. Insofar as consciousness is a mental state, this would imply that zombies do not seem possible.

Either way, participants in Fischer and Sytsma’s study certainly did not ascribe much conscious experience to zombies, even though the problem framing clearly established that these creatures have a human body. Consciousness, for them, seems to be at least partly distinct from the physical.

So far, then, it looks like most people intuit that consciousness is ethereal; at least, that is what they say about philosophical zombies. It is therefore tempting to conclude that what’s true for zombies could hold for the many other scenarios that philosophers have used to probe our consciousness intuitions. After all, if we assume that these scenarios shed light on what consciousness is—and, as far as we know, consciousness does not change—then what’s true in one case ought to be true in all.

The physicality case, it would seem, is closed; our intuitions tell us that consciousness isn’t physical. So strong is this tacit conviction that, to my knowledge, no one has bothered to check whether this is really the case. But it turns out that when people are asked to consider a second famous case—the problem of Mary in the black and white room—they tend to view consciousness as squarely physical.

Mary, the thought experiment goes, is a neuroscientist who is an expert on color vision. Mary herself, however, has never seen color, as she lives in a black and white room. Now, suppose that Mary steps out of that room and sees a red rose for the first time. How significant is that conscious experience? Will it register or “show up” in Mary’s brain?

A series of experiments from my lab at Northeastern University presented participants (a total of 180 people across four different experiments) with several versions of Mary’s case. Their task was to rate not only the significance of that new conscious experience (specifically, “How transformative is Mary’s experience seeing the color red? How much has her grasp of ‘red’ changed by seeing the red rose?”) but also its embodiment—whether it is likely to “show up” in Mary’s brain.

The philosophical literature leads us to expect that Mary’s new conscious experience is significant—it has utterly transformed her grasp of color—and participants agreed. In fact, their ratings in response to the “transformative” question (above) were significantly above the midpoint of the seven-point rating scale, so clearly, participants did view this experience as quite significant. But when asked whether Mary’s conscious experience would “show up” in a brain scan (i.e., in Mary’s physical body), they said it would!

Participants further believed that Mary’s first conscious experience of red is more likely to manifest in the brain scan than all of her “abstract” knowledge about color vision. In fact, the more likely participants were to state the “red” experience would “show up” in the brain, the more transformative it seemed. Thus, not only did participants consider Mary’s conscious experience as squarely embodied, but embodiment was also linked to the significance of this experience.

In another experiment (from the same study), participants were asked to assume that Mary’s first encounter with the red rose happens in one of two conditions (presented to Mary as part of a carefully controlled experiment). One condition has Mary looking at the red rose in full view for several seconds; when asked to report her experience, Mary confirms seeing it. In a second condition, the red rose is presented for just a fraction of a second. When asked about what she saw, Mary reports seeing nothing. Despite this, participants are told, seeing the red rose for just a fraction of a second is likely to have registered in Mary’s brain since research has shown that after such subliminal presentations, the word “rose” comes to mind more readily.

The two conditions (the subliminal and the conscious), then, are identical except that one engenders consciousness and the other doesn’t. By comparing them, we can directly examine intuitions about consciousness (as opposed to “seeing color”). If responses to the two conditions differ, then this difference will shed light on how consciousness is perceived—whether it is transformative and whether it is embodied in Mary’s brain.

Results showed that participants considered Mary’s conscious experience (in the first condition) as more transformative than her subliminal experience (in the second condition); this is just what the philosophical analysis would lead us to expect. But, contrary to what might be expected given the “hard problem,” participants also considered the conscious experience to be more likely to “show up” in Mary’s brain compared to the subliminal experience. And once again, “transformative” ratings were positively linked with the brain registering the experience, such that the stronger the intuitions that Mary’s conscious experience registered in her brain, the more transformative it seemed.

These outcomes are striking for two reasons. First, the results show that, when people consider Mary’s case, consciousness seems to them squarely physical. This flies in the face of the conventional wisdom that consciousness does not seem physical. Second, when compared with the zombie’s case, these psychological intuitions appear to shift.

In the zombie’s case, consciousness seems ethereal; in Mary’s case, it seems physical. And if different thought experiments can produce such a radical shift in intuitions about the nature of consciousness, then our intuitions about subjective experience cannot possibly be trusted to reflect what consciousness really is like. This means our intuitions about consciousness likely emerge from within—from psychological biases.

Our intuitions about consciousness are shaped by two competing psychological biases: intuitive dualism and essentialism.

A large body of literature suggests that people—adults and young children, across various societies and cultures—consider the mind distinct from the body. For intuitive dualists, philosophical zombies don’t seem so strange. Since dualists see the mind as ethereal, distinct from the body, they can readily imagine a creature that shares our body but not the inner light of conscious experience. From the outside, the creature seems just like any real person; but on the inside, there’s nobody home. It is as conscious as a rock.

Had our intuitions about consciousness stemmed only from intuitive dualism, consciousness would always have seemed ethereal, just as the case of zombies suggests. But psychological biases, such as intuitive dualism, do not operate in a vacuum; they often interact with conflicting biases. And when these interactions occur, a bias that was previously silent can suddenly become dominant. Critically, the “push and pull” dynamics between them can also be shaped by context. When context shifts—when people consider different thought experiments about consciousness—the role of dualism may be weakened. Consequently, intuitions about the link between consciousness and the brain shift, too.

Mary’s case invokes just such a shift in intuitions. What’s different about Mary’s case (relative to zombies) is that it invites us to evaluate a change to Mary herself (her new experience with color), and as a result, we now focus on her body more than her mind. This attenuates the effect of intuitive dualism and brings a second, competing constraint to the forefront—intuitive essentialism.

Essentialism is the intuitive belief that living things are what they are because they possess some innate, immutable essence that lies within their bodies. Research has shown that when people evaluate a change to a protagonist, they assess whether the change pertains to the protagonist’s essence. And since that essence seems to lie within the body, it is the body that determines the significance of that change.

For example, young children believe that a change to a dog’s insides (like removing its blood and bones) amounts to changing the kind of thing that it is, whereas external changes (like removing its fur) do not, presumably because it is within the “insides” that the animal’s hidden essence lies.

Like the dog example, Mary’s case also features a change. So for an intuitive essentialist, Mary’s newly gained consciousness of the redness of red ought to be significant or transformative only if this change pertains to her bodily essence. It follows that, to engender a significant change, a new conscious experience must affect Mary’s body. Seeing color fits the bill, because “seeing” intuitively feels like an embodied affair that involves the eyes. Accordingly, participants viewed Mary’s conscious experience as “transformative.”

Moreover, seeing color is significant precisely because intuitively, this experience seems physically embodied. The results also showed that the more “embodied” Mary’s experience seemed, the more transformative it was. This link between “embodiment” and “transformativeness” is exactly what intuitive essentialism predicts.

Together, intuitive dualism and essentialism can capture our consciousness intuitions. The crucial point, however, isn’t just why and how consciousness intuitions shift. Rather, it is the fact that a shift occurs at all that is critical. And since it does, you know you could be in trouble—your intuitions could well arise from your internal psychological biases. So, resist the temptation and do not blindly follow their delusional voice. Don’t trust your consciousness to tell you what consciousness really is.

Lead image: everything bagel / Shutterstock
