
Viewpoint: Why Uncertainty Is a Critical Factor in Designing for Disaster


Recently, the media reported an ecclesiastical forecast of the end of the earth. Most of us are thankful that the event, dubbed the "rapture," proved inaccurate. But the same source, likely using the same information updated with one piece of hard fact (the non-event), now forecasts the rapture for Oct. 21, 2011. This typifies the great uncertainty in forecasting extreme events.

Photo by Tom Sawyer for ENR
Damage in northern Japan following the tsunami.

The past decade has seen a number of lesser but still quite extreme events, including Hurricane Katrina, the earthquakes in Haiti, Chile and New Zealand, and the tsunami in Japan. Common to all are extraordinary natural forces, inadequate preparations, great losses and shocked surprise.

Why are we surprised by powerful disasters? Events of similar intensity to those cited have occurred before. In 1993, a tsunami struck the west coast of the Japanese island of Hokkaido, just north of Honshu, whose east coast was devastated by a tsunami in March. The 1993 tsunami generated wave heights comparable to those of the 2011 event.

A recent ENR article by Senior Editor Tom Sawyer included a graphic based on geologic data that showed evidence of multiple large prehistoric tsunamis on the Oregon coast. On the U.S. East Coast, geologic data indicates a large tsunami washed over what is now New York City 2,300 years ago.

Of the three components of risk—the hazard, the reliability of systems to mitigate the hazard and the consequences of the event—hazard is the most difficult to quantify, especially with rare events. As ENR reported, scientists in Japan did not think a magnitude-9 earthquake would occur off Honshu's coast, thus the tsunami mitigation measures in place were designed for a lesser event. The consequences for the most tsunami-prepared country were devastating.

Hazards often involve processes operating on geologic scales of time and space. But we tend to think about them on the human scale. Too often we use relatively recent data records and extrapolate to geologic time scales. To simplify analysis, we lean on the assumption that past events are representative of future events. We assume the most severe events recorded—or even worse, the most severe events we remember—represent the worst nature has to offer. It should be no surprise that this approach has not served us well. Uncertainty in natural processes and our failure to estimate that uncertainty have often led to surprise and dire consequences.
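The mismatch between record length and event rarity can be made concrete with elementary probability. As an illustrative sketch (the specific figures below are assumptions for illustration, not from the article): if an event has a return period of T years, i.e., an annual exceedance probability of 1/T, the chance that an n-year record contains at least one such event is 1 − (1 − 1/T)^n. A 30-year record, for instance, has only about a 6% chance of containing even one 500-year event, so the largest event on record says little about the worst nature can deliver.

```python
def prob_observed(return_period_years: float, record_years: int) -> float:
    """Probability that a record of the given length contains at least
    one event of the given return period, assuming independent years."""
    annual_prob = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_prob) ** record_years

# A 30-year record has only a ~6% chance of containing a 500-year event,
# so extrapolating from short records badly understates rare extremes.
print(f"{prob_observed(500, 30):.1%}")  # → 5.8%
```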


Ed Link
What do we do about it? First, we must understand the level of uncertainty that exists, which can be accomplished only with probabilistic analysis. Traditional deterministic methods won't do. It is also important to examine all sources of uncertainty, including imperfect knowledge, natural variability and policy. Second, we must apply this information to reduce risk.

While risk and uncertainty are considered with respect to operational safety in the design of critical infrastructure—such as large dams, powerplants and industrial facilities—they are only now emerging as critical factors in the designs for natural hazard mitigation.

The new Hurricane Storm Damage Risk Reduction System in New Orleans is one of the first large-scale examples of applying calculated uncertainty to risk-mitigation design. Instead of using the statistically most-likely water levels for an event that has a 1% chance of occurring in any given year (a 100-year return period), the design criteria added water heights based on the quantified uncertainty, a more conservative design.
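The "1% annual chance" framing also implies a surprisingly high chance of exceedance over a structure's life, which is one reason a conservative margin matters. A minimal sketch (the 50-year design life is an assumed figure for illustration, not from the article):

```python
def lifetime_exceedance(annual_prob: float, design_life_years: int) -> float:
    """Chance of at least one exceedance of the design event during the
    design life, assuming independent years."""
    return 1.0 - (1.0 - annual_prob) ** design_life_years

# A "100-year" (1%-annual-chance) event has roughly a 39% chance of
# being met or exceeded at least once over a 50-year design life.
print(f"{lifetime_exceedance(0.01, 50):.1%}")  # → 39.5%
```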

A second initiative to factor uncertainty into design uses comprehensive risk assessment to determine where armoring would increase the levees' resilience and their ability to withstand overtopping. Levee erosion after overtopping was the primary cause of catastrophic breaching during Katrina. Resilient structures could have halved the flooding and losses.

We need to acknowledge the uncertainty in natural hazard events and routinely estimate and apply uncertainty factors to the design and implementation of risk-reduction measures. The New Orleans example is a crude but welcome move. Does this cost more? Initially, yes. Over time, formally factoring in risk can bring enormous savings, not only monetarily but also in reducing major interruptions to society and the economy. There is rapture in that.

Dr. Lewis E. “Ed” Link is a research professor in the department of civil and environmental engineering at the University of Maryland. Contact him at

