Black Swans, Perfect Storms: Lame Excuses for Poor Risk Management

An engineering risk management approach works when statistics aren’t enough
Stanford News Service
Published: Tuesday, November 20, 2012 - 14:26

The terms “black swan” and “perfect storm” have become common for describing disasters ranging from the 2008 financial meltdown to the Sept. 11, 2001, terrorist attacks. But according to Elisabeth Paté-Cornell, a Stanford professor of management science and engineering, people in government and industry are using these terms too liberally in the aftermath of a disaster as an excuse for poor planning.
Her research, published in the November 2012 issue of the journal Risk Analysis, suggests that other fields could borrow risk analysis strategies used in engineering to make better management decisions, even after experiencing once-in-a-blue-moon events where statistics are scant, unreliable, or nonexistent.

Paté-Cornell argues that a true “black swan,” an event that is impossible to imagine because we’ve known nothing like it in the past, is extremely rare. Usually there are warning signs before hazards emerge, and those signs can be monitored to guide quick risk-management responses. The attacks of 9/11, for example, were not black swans, she says. The FBI knew that questionable people were taking flying lessons on large aircraft, and a group of terrorists appeared to have had a similar plan in 1994, when it took over a Paris-bound Air France aircraft in Algiers, Algeria.

Similarly, she believes the risk of a “perfect storm,” in which a situation is drastically changed by a combination of unfavorable circumstances, can be assessed in a systematic way before the event: even though such conjunctions are rare, the individual circumstances that cause them have been observed in the past.

“Risk analysis is not about predicting anything before it happens; it’s just giving the probability of various scenarios,” Paté-Cornell says. Systematically exploring those scenarios can help companies and regulators make smarter decisions before an event occurs.

Think like an engineer

An engineering risk analyst thinks in terms of systems, their functional components, and their dependencies, says Paté-Cornell. For example, generators, turbines, water pumps, and safety valves may all contribute to making a system work. A risk analyst must first understand how that system works as a whole before determining how it can fail. The same applies to medical, financial, or ecological systems.

A systematic approach is also relevant to the human aspects of risk analysis. “Some argue that in engineering you have hard data about hard systems and hard architectures, but as soon as you involve human beings, you cannot apply the same methods due to the uncertainties of human error,” says Paté-Cornell. “I do not believe this is true.”

In fact, Paté-Cornell and her colleagues have long been incorporating “soft” elements into their systems analysis to calculate the probability of human error. They look at all the people with access to the system and factor in any available information about past behaviors, training, and skills. Paté-Cornell has found that human errors, far from being unpredictable, are often rooted in the way an organization is managed. “We look at how the management has trained, informed, and given incentives to people to do what they do and assign risk based on those assessments,” she says.

A proven approach

Paté-Cornell has successfully applied this approach in the finance field, where she estimated the probability that an insurance company would fail given its age and size. She says companies funded her research because they needed forward-looking models that their financial analysts generally did not provide. Traditional financial analysis, she says, is based on evaluating existing statistical data about past events. In her view, analysts can better anticipate market failures by recognizing precursors and warning signs and factoring them into a systemic probabilistic analysis.
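To make the systems view described above concrete, here is a minimal sketch of how an analyst might fold both component reliabilities and a management-dependent human-error probability into one system-level failure estimate. The plant, its components, and every number below are hypothetical; nothing here is drawn from Paté-Cornell’s actual models.

```python
# Illustration only: a hypothetical plant modeled as components plus a
# human-error term. Structure and numbers are invented, not Paté-Cornell's.

def system_failure_probability(well_trained_crew: bool) -> float:
    """Annual probability that the hypothetical plant loses cooling."""
    # Power is lost only if BOTH the main and backup generators fail
    # (treated as independent for simplicity).
    p_power_lost = 0.02 * 0.05

    # The "soft" organizational factor: weak training and incentives are
    # assigned a higher chance of mishandling a cooling alarm.
    p_operator_error = 0.01 if well_trained_crew else 0.08

    # Cooling is lost if power is lost, or the pump fails, or the operator
    # mishandles the alarm; with independence, use the complement rule.
    p_pump_fails = 0.01
    p_all_ok = (1 - p_power_lost) * (1 - p_pump_fails) * (1 - p_operator_error)
    return 1 - p_all_ok


print(f"Well-managed crew:   {system_failure_probability(True):.4f}")   # ~0.021
print(f"Poorly managed crew: {system_failure_probability(False):.4f}")  # ~0.090
```

The only point of the sketch is that a management decision, here whether the crew is well trained, changes a probability inside the model and therefore changes the system-level risk.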
Paté-Cornell says the same approach is useful for calculating patient risk when medical specialists must make decisions with limited statistical data. She has used systems analysis to assess data about anesthesia accidents endangering patients’ lives. Based on her research, she suggested retraining and recertification procedures for anesthesiologists to make their system safer.

“Lots of people don’t like probability because they don’t understand it,” says Paté-Cornell. “They think if they don’t have hard statistics, they cannot do a risk analysis. In fact, we generally do a system-based risk analysis because we do not have reliable statistics about the performance of the whole system.”
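The same spirit applies when data are scarce: rather than waiting for whole-system statistics, an analyst can put a prior on a component’s failure probability, update it with the few observations available, and propagate the result upward. The sketch below uses a standard beta-binomial update on invented numbers; it illustrates that general technique, not Paté-Cornell’s anesthesia or finance studies.

```python
# Illustration only: all numbers are invented. A Beta prior is updated with
# sparse observations of a hypothetical component, then propagated to a
# simple redundant-pair system.
from scipy.stats import beta

failures, demands = 2, 40                                  # hypothetical operating record
posterior = beta(1 + failures, 1 + demands - failures)     # uniform Beta(1, 1) prior

p_mean = posterior.mean()                     # point estimate of failure probability
p_low, p_high = posterior.interval(0.90)      # 90% credible interval

# The system fails only if BOTH redundant units fail (independence assumed
# purely to keep the arithmetic simple).
p_system = p_mean ** 2

print(f"Component: {p_mean:.3f} (90% interval {p_low:.3f} to {p_high:.3f})")
print(f"Redundant pair: {p_system:.5f}")
```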
Article by Kelly Servick. First published in the Nov. 15, 2012, edition of Stanford News.
About The Author
Stanford News Service

The Stanford News Service is part of Stanford University’s Office of University Communications. It provides assistance to reporters and disseminates much of the university’s news. It also serves as a liaison between scholars and media outlets. Stanford University is recognized as one of the world’s leading research and teaching institutions.