When anthropologist Alyssa Crittenden began studying the Hadza people of Tanzania 10 years ago, she was surprised to see an 8-year-old girl head out to forage for golden kongolobe berries with her 1-year-old niece swaddled snugly on her back. The behavior contrasted starkly with Crittenden’s own experience growing up in the United States, where mothers often raise infants alone or with little help. But the Hadza live much as their ancestors did 10,000 years ago. In addition to hunting and gathering, Crittenden learned, Hadza community members take turns caring for one another’s children—a practice anthropologists call “infant sharing.”
In fact, infant sharing is so widespread among traditional societies that scientists believe it played a key role in human evolution. Remember the old adage “It takes a village to raise a child”? “It turns out that [saying] has very deep evolutionary roots,” says Crittenden, now at the University of Nevada, Las Vegas.
More evidence for the evolutionary origins of infant sharing can be found in other primate species. About half of the roughly 200 species of primates alive today exhibit the practice, says Sarah Hrdy, an evolutionary biologist at the University of California, Davis. Newborn langur monkeys, for example, cling to the backs and bellies of other group members for at least half of their first day of life. In societies of wide-eyed owl monkeys and orange-bearded titi monkeys, fathers do most of the coddling, passing babies back to their mothers only for brief visits, such as when it’s time to nurse. (Chimpanzees are a notable exception; mothers don’t rely on anyone else for childcare during the first six months of a baby’s life.)
But why would infant sharing be an evolutionary advantage? For one possible answer, just look at human development, says Chris Kuzawa, an anthropologist at Northwestern University. In a 2014 study in the Proceedings of the National Academy of Sciences, he and his colleagues used brain-imaging data to calculate how much glucose (energy) the brain needs from birth to adulthood. They found that the brain demands the most sustenance between ages 3 and 7—a crucial period of brain development.
By this time, Kuzawa points out, a Paleolithic mom probably would have been pregnant again, or already nursing another child. So how did her firstborn get the nourishment he needed to grow a big, healthy brain? The child bonded with other clan members, mothers as well as non-mothers—a skill we still use today, Kuzawa argues. “Whoever is a part of our lives and cares for us is who we attach to. It’s pretty flexible.”
And while children evolved to bond with us, we may have evolved to want to care for them. Research suggests that simply witnessing motherhood primes us to act as caretakers, even to babies we didn’t birth. One study from 2000, for instance, found that prolactin, a hormone associated with nurturing, increased in men when they were living with a pregnant woman. Another study from 2010 showed that fathers experienced a bump in oxytocin, a hormone associated with social bonding, after spending time with their newborns.
Yet despite evidence that infant sharing evolved as a pervasive—and likely beneficial—trait in humans, many people in Western societies criticize mothers who leave their young children at daycare or with nannies. In the U.S., for instance, 44 percent of stay-at-home moms think working mothers are a “bad thing for society,” according to a 2007 survey by the Pew Research Center.
But recent studies suggest that children of working mothers benefit: daughters tend to be higher achievers, for example, and sons tend to share more of the housework. And Crittenden, for one, is comforted to know “that our species evolved as cooperative breeders.” As a working mom, she feels less guilty for leaving her child at daycare. “For me this is very liberating.”
Regan Penaluna is a senior editor at Guernica magazine.