Some 10,000 years ago, near Lake Turkana in Kenya, a group of hunter-gatherers went on a raid, sparing only a few young men and women to take as the spoils of victory. At the scene were dozens of corpses whose bones bear evidence of what took place: lodged arrows, head and body trauma from different types of clubs, and crushed skulls. The remains of one victim, a woman who was evidently pregnant, leave little room for doubt that this was a gruesome massacre—and possibly the earliest case of organized warfare among humans.
What researchers found there was “unique,” wrote Marta Mirazón Lahr, a Cambridge evolutionary anthropologist, and her colleagues in Nature last year. The slaughtered remains were, they wrote, “preserved by the particular conditions of the lagoon with no evidence of deliberate burial.” Sites containing prehistoric skeletal remains may be relatively common, but this one, they concluded, offers “a rare glimpse into the life and death of past foraging people, and evidence that warfare was part of the repertoire of inter-group relations among prehistoric hunter-gatherers.”
We often say that war should be the last resort—the thing we regrettably do when diplomacy fails. Yet it may have been the first resort for much of human evolutionary history. Something must have changed if there’s since been a historic decline of violence, as Steven Pinker argued in his 2011 book, The Better Angels of Our Nature: Why Violence Has Declined. He says you can count on one hand the “historical forces” that have led to this decline. They include large-scale commerce (“other people,” he writes, became “more valuable alive than dead”), the rise of judiciaries and the monopoly on the legitimate use of force by modern nation-states (which can, among other things, “defuse the temptation of exploitative attack”), and the spread of reason (which, he writes, “can force people to recognize the futility of cycles of violence” and reframe violence as “a problem to be solved rather than a contest to be won”).
But Gregory Clark, a professor of economics at the University of California, Davis, has another explanation. He believes that our appetite for violence declined over thousands of years because humans evolved to be “more patient.” In a 2007 paper, Clark argues that “there is evidence that the long Malthusian era in stable agrarian societies actually changed human preferences, perhaps culturally but also perhaps genetically.”
“In order to be successful in modern economic life,” Clark says, “we have to be willing to defer benefits. If you look at hunter-gatherer society, people are often very present-oriented. They chopped down a fruit tree to get the fruit, even though it’s in their territory, and so it means the next year, there won’t be any fruit.” The shift away from impulsiveness toward the more organized, systematic behavior characteristic of an agrarian lifestyle, Clark says, may have contributed to the decline in our violent tendencies by curbing “very present-oriented,” or impatient, tendencies more generally.
Andrey Anokhin, a professor of psychiatry at Washington University in St. Louis who studies the genetic and neurobiological bases of human individual differences, agrees. “Forms of impulsive behaviors involve discounting of future consequences, including both rewards and punishments,” he says. “Violence is often impulsive.”
If we evolved to be more patient, then patience in some form must be heritable. That seems to be the case. In a 2014 study, titled “The Genetics of Impulsivity: Evidence for the Heritability of Delay Discounting,” Anokhin and his team examined pairs of adolescent twins and found that “the extent to which individuals tend to discount delayed consequences of their actions in favor of immediate rewards are substantially influenced by genetic factors.” Genes linked to the brain’s serotonin and kappa opioid receptors, which are associated with depression and addiction, can affect a person’s level of impulsiveness. What’s more, delay discounting also seems to be “temporally stable,” meaning the trait’s influence on behavior doesn’t really fluctuate over your lifetime. That may not be a surprising finding given the results of follow-up studies to the famous 1972 “Marshmallow Experiment.”
In the initial experiment, the 26 subjects, all of them preschoolers at the time, were each given a single marshmallow and told that if they could go 15 minutes without eating it, they would receive a second one. A 1990 follow-up study found that the kids who had waited for the second marshmallow went on to perform better on the SAT than those who had been less able to delay gratification. And a 2011 study, conducted 40 years after the original experiment, found that the high delayers showed greater activity in a region of the prefrontal cortex associated with impulse and behavior control than the low delayers did. “Resistance to temptation,” the researchers concluded, “…is a relatively stable individual difference.”
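The idea of delay discounting can be made concrete with a standard model from behavioral economics (not one used in the article or the studies above): the subjective value of a reward shrinks hyperbolically with its delay, and an individual’s discount rate k captures how impulsive they are. A minimal sketch, with made-up k values for illustration:

```python
# Hyperbolic delay discounting: V = A / (1 + k * D), where A is the reward
# amount, D the delay, and k the individual's discount rate (higher k = more
# impulsive). The k values below are hypothetical, chosen only to illustrate.

def discounted_value(amount, delay, k):
    """Subjective present value of `amount` received after `delay` time units."""
    return amount / (1 + k * delay)

patient_k = 0.01    # discounts the future only slightly
impulsive_k = 0.5   # steeply discounts the future

# The marshmallow choice: one marshmallow now vs. two after a 15-minute wait.
one_now = discounted_value(1, 0, patient_k)               # worth 1.0 to both
two_later_patient = discounted_value(2, 15, patient_k)    # ~1.74, so waiting wins
two_later_impulsive = discounted_value(2, 15, impulsive_k)  # ~0.24, so eat it now

print(two_later_patient > one_now)     # the patient child waits
print(two_later_impulsive < one_now)   # the impulsive child doesn't
```

With a low k, the doubled future reward outweighs the immediate one; with a high k, the same objective offer loses to the marshmallow in hand, which is the trait the twin studies found to be partly heritable.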
We are largely the descendants of those who lived in patience-inducing agrarian societies. So you might wonder whether the greater patience we may have inherited could have, at one crucial point in recent history, helped avert nuclear holocaust. In 1983, Stanislav Petrov, an officer in the Soviet Air Defense Forces, received an alert that the United States had fired five nuclear missiles. Rather than recommend immediate retaliation, Petrov kept his composure and took time to assess the situation. In the end, he correctly judged it to be a false alarm.
“If the Soviet Union had overreacted, it could have gone very badly,” the former KGB officer Oleg A. Gordievsky told the Baltimore Sun, in 2003, about Petrov’s decision. “If war had come, Soviet missiles would have destroyed Britain entirely, at least half of Germany and France, and America would have lost maybe 30 percent of its cities and infrastructure.”
Matthew Sedacca is an editorial intern at Nautilus.