Are your neurons thrown off by this photo? Shutterstock

Most of us have an uneasy love/hate relationship with celebrity culture. No matter how much we try to pretend we’re above it all, celebrities somehow seep into our consciousness, whether it’s Miley Cyrus’s cringe-inducing twerking at the VMAs, or our enduring affection for the ensemble cast of The Big Bang Theory—or, in an earlier era, Friends. Then again, our fascination with celebrities just might lead to important breakthroughs in our understanding of how the brain stores and retrieves memories.

Eight years ago, neuroscientists at UCLA working with a small group of epilepsy patients discovered a neuron that appeared to respond exclusively to images of the actress Jennifer Aniston (who played Rachel on the aforementioned Friends). It didn’t matter whether the image showed just her face or her full body; this one neuron in this one subject’s brain—specifically, in the medial temporal lobe—lit up in response.

There were some caveats. The neuron didn’t light up in response to images where Aniston appeared with her then-husband, Brad Pitt, but it did respond to images of her Friends co-star, actress Lisa Kudrow. This indicated that the cell was actually responding to a concept—the character of Rachel on Friends—rather than to Aniston herself. And it’s not like every one of us has a Jennifer Aniston neuron; other subjects in the study had neurons that responded to images of actress Halle Berry, Oprah Winfrey, Luke Skywalker, or Yoda.

“For things that you see over and over again, your family, your boyfriend, or celebrities, your brain wires up and fires very specifically to them,” neuroscientist Christof Koch told New Scientist in 2005 when the paper was first published. “These neurons are very, very specific, much more than people think.”

Neuroscientist Rodrigo Quian Quiroga, who led the study, has dubbed such neurons “concept cells.” The idea echoes the notion of “grandmother cells,” a thought experiment proposed by MIT’s Jerry Lettvin in the late 1960s specifically to show how oversimplified it would be to imagine single brain cells responding to specific objects or people. Lettvin described a fictional neurosurgeon whose patient wanted to forget his mother entirely—shades of the quirky Charlie Kaufman film Eternal Sunshine of the Spotless Mind, in which a heartbroken man wants to erase all memory of his ex-girlfriend. The neurosurgeon complied, removing the several thousand neurons associated with the patient’s mother—but then the patient wanted to forget his grandmother, too.

Put in those terms, it seems ludicrous. Granted, the brain may contain as many as 100 billion neurons, but even so, it seems unlikely that individual neurons correspond one-to-one with every object or person we encounter in our lives. And if you happened to lose your Jennifer Aniston neuron, would Friends even make sense to your brain anymore? (“Who is that blonde woman fighting with Ross, insisting they weren’t on a break?”)

Quiroga’s “concept cells” clearly play an important role in memory formation via associations, such as linking Aniston with her Friends co-star Kudrow, but not with Pitt. But the actual process by which this occurs is far more complicated than the explanation offered by the original notion of “grandmother cells.”

For example, powerful emotions make for stronger connections and associations, and for more vivid memories, thanks to the amygdala and the powerful neurochemicals it trucks in (dopamine, serotonin, epinephrine, norepinephrine, and acetylcholine). The cells that produce those chemicals are found in the brain stem, from which axons branch out into every other area of the brain to produce a coordinated sensory response. Fear and anxiety, or heart-pounding excitement and euphoria, are the result of those cells flooding the neurons in various brain regions with neurochemicals, although only active cells are affected. Different regions process and store different sensory aspects of a given experience, so this widespread flooding of neurochemicals ensures that all those regions record information from the event. The cortex then integrates the information, and the experience becomes part of our memory.

When it comes to memory retrieval, many researchers maintain that memory does not operate like a computer, where stored files are pulled and re-opened intact. Our memories are distributed over several different regions of the brain, and each time we recollect an event, we are, in essence, reconstructing it from scratch, based on a few key clues.

Koch and others, including Quiroga, have surmised that any one “concept” might require 20,000 to 100,000 neurons for full representation. That’s a small fraction of the entire brain, to be sure—what neuroscientists call a “sparse” representation—but not nearly as sparse as a single grandmother cell. This arrangement is sufficient to let us make rapid associations, and may have evolved as a fast-moving survival mechanism—“Is this friend or foe?”—rather than to help us keep track of complicated thoughts like, “Is this the one where Rachel flies to England to crash Ross’s wedding and ends up sitting next to Hugh Laurie on the plane?”

Jennifer Ouellette is a science writer and the author of The Calculus Diaries and the forthcoming Me, Myself and Why: Searching for the Science of Self. Follow her on Twitter @JenLucPiquant.
