Investment companies are fond of warning that past performance is no guarantee of future returns. Don’t rely on the past, they say, to make decisions about what may lie ahead. And when it comes to planning for low-probability events, this seems like sound advice indeed. Disasters, pandemics, and the like often have a thin historic record. Surprisingly, though, the modern policymaker truly does follow Shakespeare’s edict, “What’s past is prologue.” The thin but growing data set on high-consequence, low-probability events, together with heightened media awareness, is improving both the accuracy of models and the likelihood that they will be put into practice.

Consider earthquakes. Geologists like University of Southern California Professor Tom Jordan, who is also the director of the Southern California Earthquake Center, study the pattern of past earthquakes to help predict when the next Big One will hit. (A Big One is a quake that does extreme damage. In Southern California, a quake must reach a magnitude greater than 7.0 to qualify.) There have been precious few of these quakes, but a combination of historic evidence, hard geological data, and probability mathematics can bring the threat into at least some focus.

“If events occur randomly in time… then you can calculate their frequency,” Jordan says. “We know Southern California has Big Ones. We can see them geologically in the past—every 100 to 150 years. So suppose you say on average every 100 years, but randomly in time, what is likely to occur…that is a time-independent Poisson [probability distribution] model.”
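The time-independent Poisson model Jordan describes reduces to one line of arithmetic: if Big Ones arrive randomly at a mean rate of one per 100 years, the chance of at least one in a window of t years is 1 − exp(−t/100). A minimal sketch of that calculation (the 100-year interval is the average Jordan cites; the function itself is just an illustration):

```python
import math

def poisson_prob_at_least_one(mean_recurrence_years: float, window_years: float) -> float:
    """P(at least one event in the window) under a time-independent Poisson
    model with the given mean recurrence interval."""
    rate = 1.0 / mean_recurrence_years          # expected events per year
    return 1.0 - math.exp(-rate * window_years)

# Big Ones every ~100 years on average, arriving randomly in time:
p30 = poisson_prob_at_least_one(100, 30)
print(f"Chance of a Big One in the next 30 years: {p30:.1%}")  # about 26%
```

Note the counterintuitive consequence of the model: because events are memoryless, the 30-year probability is the same whether the last Big One struck yesterday or 140 years ago.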

Mathematics in hand, Jordan then goes looking for hard data, like the composition and hardness of the bedrock and surface soils, the rate at which opposing sides of fault lines are slipping past each other (the Pacific Plate is sliding past the North American Plate along the San Andreas Fault at about 33 to 37 millimeters per year), and the structure of the faults (geologists have identified three major sections of the San Andreas, each with its own record of quakes).

This combination of math and data produces what are known as earthquake hazard curves, which graph the probability of a particular magnitude of quake versus time for a given area. Using hazard curves, the Southern California Earthquake Center predicted in 2007 that California had a 99.7 percent chance of having a magnitude 6.7 or larger earthquake during the next 30 years, a 90 percent chance of a 7.0, a 46 percent chance of a 7.5, and a 4 percent chance of an 8.0. What’s more, because these models use local geological data, they can produce local predictions. The Southern California section of the San Andreas (from midway between San Francisco and Los Angeles down to the east of San Diego) has the highest likelihood of a quake of magnitude 6.7 or more: 59 percent. In second place is the section running just east of San Francisco, at 31 percent. Jordan boils it down this way: “The San Andreas Fault is locked and loaded.”
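Under the same Poisson assumption, the quoted 30-year probabilities can be inverted to the mean recurrence intervals they imply, which is one way to read a hazard curve. The sketch below uses the Center's published figures; the inversion is our illustration, not how the Center actually constructs its forecasts:

```python
import math

def implied_recurrence(p_window: float, window_years: float) -> float:
    """Mean recurrence interval implied by a window probability,
    assuming a time-independent Poisson process."""
    rate = -math.log(1.0 - p_window) / window_years  # annual event rate
    return 1.0 / rate

# 30-year probabilities from the 2007 forecast, by minimum magnitude:
forecasts = {6.7: 0.997, 7.0: 0.90, 7.5: 0.46, 8.0: 0.04}
for magnitude, p in forecasts.items():
    years = implied_recurrence(p, 30)
    print(f"M >= {magnitude}: implied mean recurrence ~{years:.0f} years")
```

The spread is striking: a 46 percent chance of a magnitude 7.5 corresponds to roughly a once-in-fifty-years event, while the 4 percent chance of an 8.0 implies an interval of many centuries.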

Past data is even more informative when it comes to planning how to protect us from earthquakes. John Ebel, Professor of Geology and Geophysics at Boston College and Director of its Weston Observatory, helped design one of the most commonly used models for this sort of work, called Hazus. It calculates the amount of ground shaking that results from various-magnitude quakes, for any given epicenter. It takes into account the nature of the fault, the sorts of waves it produces (so-called P waves and S waves cause different kinds of damage), and the nature of the bedrock and soil at each point being investigated. “Motion in mushy, soft soils like landfills next to [Boston’s] Logan Airport, for example, will be 50 percent higher,” Ebel says.

Of particular value to policy planners is Hazus’s ability to predict the dollar value of the damage that will result from a quake. Hazus uses “fragility curves,” equations that represent what engineering and physics can tell us about how different kinds of buildings will react to various levels of shaking. Wood buildings are more flexible than those built from masonry. Short ones fare better than tall ones. The model is then fed information about the number and nature of the buildings at each map point being investigated, as well as the actual costs of earthquakes in what Ebel calls “similarly-built environments.” These can be other places in the U.S., or in Japan or Europe, where the number and types of buildings, roads, and other infrastructure components are similar to those in the area being studied. “You can add in economic losses to transportation systems, how long it will take to get utilities back, how many fires might start, all based on previous experiences for a particular kind of constructed community,” Ebel says.
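A fragility curve is conventionally modeled as a lognormal distribution over shaking intensity: the probability that a building reaches a damage state rises smoothly with ground motion, governed by a median capacity and an uncertainty parameter. The sketch below follows that standard lognormal form, but the median capacities and beta values are invented for illustration, not Hazus-calibrated parameters:

```python
import math

def lognormal_cdf(x: float, median: float, beta: float) -> float:
    """CDF of a lognormal distribution, the standard form for fragility curves."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

def damage_probability(pga_g: float, median_g: float, beta: float) -> float:
    """P(building reaches the damage state | peak ground acceleration, in g)."""
    return lognormal_cdf(pga_g, median_g, beta)

# Illustrative parameters only: a flexible wood-frame building is assigned a
# higher median capacity than brittle unreinforced masonry.
wood = damage_probability(0.4, median_g=0.6, beta=0.5)
masonry = damage_probability(0.4, median_g=0.3, beta=0.5)
print(f"P(extensive damage) at 0.4 g shaking: wood {wood:.0%}, masonry {masonry:.0%}")
```

Summing expected repair costs over every building at every map point, weighted by these probabilities, is essentially how a model like Hazus turns ground shaking into a dollar figure.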

Hazus can even estimate injury and death. And to study hurricanes, floods, or tornadoes instead of earthquakes, you only have to replace the model’s inputs about ground shaking with data about wind speed, precipitation, or flood-zone maps (this is designated as Hazus Multi-Hazard, or MH).

Past disasters, then, can be useful in predicting when new ones will occur, and how to survive them. But there is another important way in which past disasters help the modern policymaker: by making us aware of the risk. Take the possibility of a major impact by an asteroid or other near-Earth object. The last impact that did serious damage was the Tunguska event over central Russia in 1908 (though there was no actual impact with the surface of the Earth—an asteroid or comet roughly 300 feet across apparently exploded in the air). Small wonder, then, that “20 years ago the near-Earth-object field practically didn’t exist,” says Don Yeomans, who heads NASA’s near-Earth-object program. Former astronaut Rusty Schweickart says Yeomans is “…one of the reasons we can all sleep a little better at night.” 

Back then, “we had ‘the giggle factor’ when it was mentioned that these objects could be dangerous and could be looked for,” Yeomans says. “People would laugh and say ‘Yeah, when was the last time?’ Simply because we didn’t see them, they didn’t take the threat as seriously as we have come to.”

But in March 1993, astronomers Carolyn and Gene Shoemaker and David Levy spotted a comet (now called Shoemaker-Levy 9) on a collision course with Jupiter. When its fragments struck the planet in July 1994, millions saw the incredible images of the massive explosions. There were 21 separate impacts, the largest estimated at roughly 600 times the explosive power of the entire world’s nuclear weapons arsenal. Because Jupiter has no solid surface there was no crater, but the dark scar the impact seared into the atmosphere was some 7,500 miles across, almost big enough to reach from the North Pole to Rio de Janeiro. A single impact like that would likely have wiped out life on Earth.

The movies Armageddon and Deep Impact followed in the next few years, along with a couple of ultimately false alerts from the astronomy community about possible near-Earth objects headed our way that got huge press coverage. In the past 10 years, funding for NASA’s work to spot objects that might collide with Earth has gone from practically nothing to more than $20 million a year, and we’ve located more than 90 percent of the big ones that could do serious damage. (Good news: None are headed our way.) Furthermore, the United Nations has created a formal committee to coordinate global efforts to find these objects and plan for what to do if one is found to be heading toward us.

What happened with asteroids is an example of what cognitive psychologists call the Availability Heuristic, a phenomenon whereby we tend to pay more attention to, and worry more about, matters that readily come to mind. Here’s an example: Does the letter r appear more frequently as the first letter in words, or the third? As you search through the words you know to figure this out, the first letter is the first thing that comes to mind. As a result, most people say r is more common as a first letter in words, but in fact it is more common as the third.  The effect is compounded when strong emotions, like fear, are brought into play. Emotionally powerful experiences burn more deeply into our memories and are more readily summoned, and the speed and power of that recall give those memories disproportionate influence on our perceptions.

It’s only in the past few years that the idea of Earth being struck by an asteroid or comet has become salient. “Salience makes all the difference,” Yeomans says. Now, “as soon as the public actually sees the risk and gets interested, they notify their congressmen, and they get interested, and suddenly the policymakers are taking action. We don’t have trouble getting people to care anymore.” (Occasional spectacular events don’t hurt, either, like the fireball over Chelyabinsk, Russia, in February, which arrived the same day as the near-miss flyby of Asteroid 2012 DA14.)

The rise of disaster salience is not limited to near-Earth objects. Natural disasters are becoming more frequent, in part because of climate-related events, and, with rising population density, their consequences are more severe. In addition, the modern era of 24/7 media coverage makes every disaster more widely known. Ebel says that’s a big part of why earthquake-preparedness policies are the most advanced he’s seen in the nearly 40 years he’s been involved with the issue. “Now, because of camera technology and the ability to stream video over the Internet, there are so many more dramatic pictures than would have been available. It lets us experience things viscerally. You could watch the [2011] tsunami moving inland in Japan. It heightens people’s interest.”

Howard Kunreuther, Co-Director of the Risk Management and Decision Processes Center at the Wharton School of the University of Pennsylvania, has advised governments and companies around the world on disaster planning. Paradoxically, Kunreuther observes, “I’m optimistic that things will get sufficiently worse that, unfortunately, people will pay more attention.” He says an informal poll of top corporate leaders found precisely that. “They all tell us, ‘We pay attention to these things now, now that they’ve happened.’” Similarly, Ed Fratto, Executive Director of the Northeast States Emergency Consortium, which brings together top emergency-planning officials from several Northeast states, observes that, “Governors are now focused on emergency management. There have been more disasters; they appear to be happening more frequently, affecting states that haven’t been impacted in the past. They seem to be more severe.”

The fact remains that no matter how intelligently policymakers develop their predictions and plans for these events, we are stuck with a brain that is not the rational computer we think it is. Jordan is frank about what happens when policymakers are handed numbers that are not certain and don’t imply imminent risk: “As far as I can tell, people look at the numbers and then throw them away and make decisions based on what their gut tells them.” Most people don’t do basic probability arithmetic very well, including those who are well-educated, according to a recent study1. At the same time, we are getting better at planning for low-probability, high-consequence events, and we are taking them more seriously—witness New York’s recent commitment to a $20 billion flood-protection program. And for this, we can thank past disasters. 

David Ropeik is a consultant in risk perception and risk communication who also teaches an environmental management course at Harvard Extension School. He’s the author of How Risky Is It Really? Why Our Fears Don’t Always Match the Facts, and his blog, Risk: Reason and Reality, appears at BigThink.com.


This article has been revised to reflect the following correction:

This article initially reported that the San Andreas Fault moves 33 to 37 centimeters a year, whereas it actually moves 33 to 37 millimeters a year. 
