Black Swans, Perfect Storms: Lame Excuses for Poor Risk Management

An engineering risk management approach works when statistics aren’t enough

Published: Tuesday, November 20, 2012 - 14:26

The terms “black swan” and “perfect storm” have become common for describing disasters ranging from the 2008 financial meltdown to the Sept. 11, 2001, terrorist attacks. But according to Elisabeth Paté-Cornell, a Stanford professor of management science and engineering, people in government and industry are using these terms too liberally in the aftermath of a disaster as an excuse for poor planning.

Her research, published in the November 2012 issue of the journal Risk Analysis, suggests that other fields could borrow risk analysis strategies used in engineering to make better management decisions, even when dealing with once-in-a-blue-moon events where statistics are scant, unreliable, or nonexistent.

A black swan

Paté-Cornell argues that a true “black swan”—an event that is impossible to imagine because we’ve known nothing like it in the past—is extremely rare. Usually before hazards emerge there are warning signs that can be monitored to guide quick risk-management responses.

For example, the attacks of 9/11 were not black swans, she says. The FBI knew that questionable people were taking flying lessons on large aircraft. A group of terrorists appeared to have had a similar plan in 1994, when they hijacked a Paris-bound Air France aircraft in Algiers, Algeria.

Similarly, she believes the risk of a “perfect storm,” in which a situation is drastically changed by a combination of unfavorable circumstances, can be assessed systematically before the event: even though such conjunctions are rare, the individual circumstances that produce them have been observed in the past.

“Risk analysis is not about predicting anything before it happens; it’s just giving the probability of various scenarios,” Paté-Cornell says. Systematically exploring those scenarios can help companies and regulators make smarter decisions before an event.
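To make that idea concrete, here is a minimal sketch of scenario enumeration in Python. The precursor conditions and their probabilities are invented for illustration (they are not from the article), and independence is assumed only to keep the example short; a real analysis would model the dependencies among conditions.

    from itertools import product

    # Hypothetical precursor conditions with probabilities estimated from past
    # observations (illustrative numbers only, not from the article).
    precursors = {
        "severe_weather": 0.10,
        "backup_system_down": 0.05,
        "operator_fatigued": 0.20,
    }

    def scenario_probability(states, probs):
        """Probability of one scenario (a specific combination of precursors),
        assuming the precursors occur independently."""
        p = 1.0
        for name, present in states.items():
            p *= probs[name] if present else 1.0 - probs[name]
        return p

    # Systematically enumerate every scenario, from "nothing goes wrong" to the
    # rare "perfect storm" in which all precursors coincide.
    for combo in product([False, True], repeat=len(precursors)):
        states = dict(zip(precursors, combo))
        label = "perfect storm" if all(combo) else ", ".join(
            name for name, present in states.items() if present) or "no precursors"
        print(f"{label}: p = {scenario_probability(states, precursors):.4f}")

Each of the individual conditions has been seen before, so its probability can be estimated; it is the joint scenario, not the ingredients, that is rare.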

Think like an engineer

An engineering risk analyst thinks in terms of systems, their functional components, and their dependencies, says Paté-Cornell. For example, generators, turbines, water pumps, and safety valves may all contribute to making a system work. A risk analyst must first understand how that system works as a whole before determining how it can fail.
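As a sketch of that systems view, the failure logic of a small pumping system might be written as a simple fault-tree-style calculation. The component names and failure probabilities below are hypothetical, and independence between components is assumed purely for brevity.

    # Hypothetical component failure probabilities over some mission time.
    p_fail = {
        "generator": 0.02,
        "turbine": 0.01,
        "pump_a": 0.05,
        "pump_b": 0.05,   # redundant backup for pump_a
        "safety_valve": 0.001,
    }

    def p_and(*ps):
        """All listed elements fail (independence assumed for illustration)."""
        out = 1.0
        for p in ps:
            out *= p
        return out

    def p_or(*ps):
        """At least one of the listed elements fails."""
        ok = 1.0
        for p in ps:
            ok *= 1.0 - p
        return 1.0 - ok

    # The pumps are redundant: both must fail for pumping to be lost.
    p_pumping_lost = p_and(p_fail["pump_a"], p_fail["pump_b"])

    # The system fails if power generation, pumping, or the safety function is lost.
    p_system_fail = p_or(p_fail["generator"], p_fail["turbine"],
                         p_pumping_lost, p_fail["safety_valve"])

    print(f"P(system failure) = {p_system_fail:.4f}")

The point of the exercise is the structure, not the numbers: the analyst must first describe how the components work together before asking how their combined failure modes arise.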

The same applies to medical, financial, or ecological systems. A systematic approach is also relevant to human aspects of risk analysis.

“Some argue that in engineering you have hard data about hard systems and hard architectures, but as soon as you involve human beings, you cannot apply the same methods due to the uncertainties of human error,” says Paté-Cornell. “I do not believe this is true.”

In fact, Paté-Cornell and her colleagues have long been incorporating “soft” elements into their systems analysis to calculate the probability of human error. They look at all the people with access to the system and factor in any available information about past behaviors, training, and skills.

Paté-Cornell has found that human errors, far from being unpredictable, are often rooted in the way an organization is managed. “We look at how the management has trained, informed, and given incentives to people to do what they do and assign risk based on those assessments,” she says.
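A minimal sketch of how such “soft” factors might enter the same kind of model follows; the weighting scheme and every number in it are invented for illustration and are not Paté-Cornell’s actual method. The idea is simply to estimate an operator’s error probability from management-related conditions and then treat the operator as one more element of the system.

    # Hypothetical baseline error probability per critical task, adjusted by
    # management-related factors. The multipliers are illustrative, not calibrated.
    BASELINE_ERROR = 0.01

    FACTOR_MULTIPLIERS = {
        "recent_training": 0.5,     # well-trained crews err less often
        "poor_incentives": 2.0,     # incentives that reward speed over care
        "high_workload": 1.5,       # fatigue and time pressure
    }

    def human_error_probability(conditions):
        """Scale the baseline error rate by whichever conditions apply."""
        p = BASELINE_ERROR
        for name, applies in conditions.items():
            if applies:
                p *= FACTOR_MULTIPLIERS[name]
        return min(p, 1.0)

    # One more "component" alongside the hardware: the operator.
    p_operator = human_error_probability(
        {"recent_training": False, "poor_incentives": True, "high_workload": True}
    )
    p_hardware = 0.004  # e.g., from a fault-tree calculation like the one above

    # System fails if either the hardware fails or the operator errs
    # (independence assumed only to keep the sketch short).
    p_system = 1.0 - (1.0 - p_hardware) * (1.0 - p_operator)
    print(f"P(operator error) = {p_operator:.3f}, P(system failure) = {p_system:.3f}")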

A proven approach

Paté-Cornell has successfully applied this approach in finance, estimating the probability that an insurance company would fail given its age and size. She says companies funded her research because they needed forward-looking models generally not provided by their financial analysts. Traditional financial analysis, she says, is based on evaluating existing statistical data about past events. In her view, analysts can better anticipate market failures by recognizing precursors and warning signs, and factoring them into a systemic probabilistic analysis.
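For instance, here is a minimal sketch of how observing one precursor can be folded into a probabilistic assessment with a simple Bayes update. The numbers are invented for the sketch; they are not from her insurance study.

    # Illustrative Bayes update: how observing a warning sign changes the
    # assessed probability of a failure over some horizon.
    p_fail = 0.02             # prior probability of failure
    p_sign_given_fail = 0.70  # the warning sign usually precedes a failure
    p_sign_given_ok = 0.10    # but it also appears in healthy firms sometimes

    p_sign = p_sign_given_fail * p_fail + p_sign_given_ok * (1.0 - p_fail)
    p_fail_given_sign = p_sign_given_fail * p_fail / p_sign

    print(f"P(failure) before the sign: {p_fail:.3f}")
    print(f"P(failure) after seeing the sign: {p_fail_given_sign:.3f}")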

Paté-Cornell says the same approach is useful for calculating patient risk when medical specialists must make decisions with limited statistical data.

She has used systems analysis to assess data about anesthesia accidents endangering patients’ lives. Based on her research, she suggested retraining and recertification procedures for anesthesiologists to make their system safer.

“Lots of people don’t like probability because they don’t understand it,” says Paté-Cornell. “They think if they don’t have hard statistics, they cannot do a risk analysis. In fact, we generally do a system-based risk analysis because we do not have reliable statistics about the performance of the whole system.”

Article by Kelly Servick. First published in the Nov. 15, 2012, edition of Stanford News.

About The Author

Stanford News Service

The Stanford News Service is part of Stanford University’s Office of University Communications. It provides assistance to reporters and disseminates much of the university’s news. It also serves as a liaison between scholars and media outlets. Stanford University is recognized as one of the world's leading research and teaching institutions.