James Lamprecht

Some Thoughts on FMEAs and Unknown Risks

Subjective probabilities are only possible when one deals with what is known

Published: Wednesday, January 15, 2014 - 10:21

Anyone who has done an online search using the terms “risk analysis,” “managing risk,” “risk management,” or any other variation will have discovered that the subject has been around for a long time and been covered by numerous authors. Still, the daunting challenge remains: How can one conduct process risk analysis without the help of a Ph.D. in statistics? 

FMEA fundamentals

A popular technique often invoked by various experts is failure mode and effects analysis (FMEA), developed several decades ago. This simple but controversial technique relies on the assignment of subjective ordinal numbers (usually on a 1–10 Likert-type scale) to rate three aspects of a potential failure:
• The difficulty (D) of detecting a failure
• The severity (S) of the failure
• The likelihood of occurrence (O) of the failure

These three subjectively estimated ordinal numbers are multiplied to “compute” a risk priority number (RPN) for each process step. The RPNs are then ranked from highest to lowest, and the steps with the highest RPNs are analyzed to see how process improvements can be designed to drive the RPN as low as possible.
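To make the arithmetic concrete, here is a minimal sketch in Python of the RPN computation and ranking just described; the process steps and their scores are hypothetical.

```python
# Minimal sketch: computing and ranking FMEA risk priority numbers (RPNs).
# The process steps and their 1-10 scores are hypothetical examples.

failure_modes = [
    # (process step, severity S, occurrence O, detection difficulty D)
    ("Solder joint cracks",   8, 3, 6),
    ("Label misprint",        3, 5, 2),
    ("Seal leaks under load", 9, 2, 7),
]

# RPN = S x O x D, ranked from highest to lowest
ranked = sorted(
    ((step, s * o * d) for step, s, o, d in failure_modes),
    key=lambda pair: pair[1],
    reverse=True,
)

for step, rpn in ranked:
    print(f"{step}: RPN = {rpn}")
# The steps at the top of the list are the first candidates for improvement.
```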

One of the criticisms of FMEA is that the ordinal values are subjectively assigned and are not based on factual or data-driven (i.e., objective) evidence. Unfortunately, data aren’t always available for each process step, and in such cases subjective estimation is the only means available. To help minimize the estimation error, one can have several individuals, usually engineers familiar with the process, provide their own subjective estimates; the results can either be averaged, or the low and high values can be dropped before averaging, as is done in some Olympic scoring events.
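A minimal sketch of that aggregation, assuming each engineer supplies a 1–10 score (the scores below are hypothetical):

```python
# Sketch: aggregating several engineers' subjective 1-10 scores for one factor.

def olympic_mean(scores):
    """Drop the single lowest and highest score, then average the rest."""
    if len(scores) <= 2:
        return sum(scores) / len(scores)
    trimmed = sorted(scores)[1:-1]
    return sum(trimmed) / len(trimmed)

severity_scores = [7, 8, 6, 9, 3]  # five engineers rate the same severity
print(f"Plain average: {sum(severity_scores) / len(severity_scores):.2f}")  # 6.60
print(f"Olympic-style: {olympic_mean(severity_scores):.2f}")                # 7.00
```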

Still, a more serious objection has been raised by numerous authors, including Donald J. Wheeler, who points out that it is a mathematical absurdity to multiply ordinal numbers to generate an RPN. Multiplication can only be meaningfully performed on ratio-type numbers; that is, numbers for which an absolute zero point has been defined. For suggestions on how to improve scoring, refer to Wheeler’s “Problems With Risk Priority Numbers.”
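A small illustration of the problem, with hypothetical scores: very different failure profiles can collapse to the same RPN, yet the ranking treats them as equally urgent.

```python
# Illustration of the objection: ordinal scores are labels of order, not
# quantities, so their products can equate very different situations.
# All scores are hypothetical.

profiles = {
    "Catastrophic but rare, easily detected": (10, 1, 2),  # (S, O, D)
    "Minor but frequent, easily detected":    (2, 5, 2),
}

for name, (s, o, d) in profiles.items():
    print(f"{name}: RPN = {s * o * d}")
# Both failure modes print RPN = 20, yet they call for very different
# responses; nothing in the arithmetic justifies treating them alike.
```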

Bayesian statistics

The assignment of subjective probabilities, or the estimation of probabilities based on subjective opinions, is not limited to FMEAs. In fact, there is a vast field of statistics known as Bayesian statistics that relies on subjective a priori probabilities (i.e., before the fact) to compute more refined a posteriori probabilities (after the fact, that is, after new evidence is taken into account). Its proponents argue that it is the only valid framework for decision making.

Given the popularity of risk analysis and its obvious association with decision analysis, it is in fact surprising that Bayesian statistics has not yet been mentioned in commentary on ISO 9001:2015. The Bayesian statistical method isn’t complicated, but it is, in my opinion, cumbersome. For simple cases, it often, though not always, proves the obvious through a roundabout and tedious computation based on ratios of conditional probabilities, a computation that invariably leads to what is known as Bayes’ theorem.
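For reference, the theorem itself is compact. For a hypothesis H and new evidence E:

\[
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}
\]

Here P(H) is the a priori probability, and P(H | E) is the a posteriori probability once the evidence E has been observed.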

No doubt Bayesian statisticians will seriously object to my dismissive assessment, and I must admit that Bayesian statistics is valuable in medical diagnosis (as well as other fields) to estimate the probability of having a particular medical condition given certain preexisting conditions. Regardless, I don’t foresee industrialists embracing Bayesian statistics in the near future.
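A short worked example of that diagnostic use, with hypothetical numbers for a rare condition and an imperfect test:

```python
# Sketch of Bayes' theorem applied to a diagnostic test; all numbers hypothetical.
prevalence = 0.01       # a priori: 1% of the population has the condition
sensitivity = 0.95      # P(positive test | condition)
false_positive = 0.05   # P(positive test | no condition)

# Total probability of a positive test (the denominator of Bayes' theorem)
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)

# A posteriori probability of the condition given a positive test
posterior = sensitivity * prevalence / p_positive
print(f"P(condition | positive test) = {posterior:.3f}")  # ~0.161
```

Even with a seemingly accurate test, the low base rate keeps the posterior probability modest, which is exactly the kind of result the roundabout computation delivers.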

Risks of risk analysis

But there are other difficulties associated with any risk analysis study, which have to do with what are referred to as “known-unknowns” and “unknown-unknowns.” There are many cases of known-unknowns, that is, cases where we can enumerate or list what is unknown. For example, we may suspect that humidity or even barometric pressure might affect a particular process, but we do not yet (or may never) know how, or whether, these variables actually matter. And even when a risk is known to exist but its probability of occurrence is unknown or grossly underestimated, corrective measures are often not implemented.

An example of a known but apparently underestimated risk was the recent security debacle suffered by the department store Target, a breach that exposed more than 40 million customers to cyber-criminals. The possibility of cyber-criminals siphoning information at various transaction points was known, but it was either wrongly assessed (subjectively, no doubt) as a low risk or simply dismissed as a serious threat.

Why is the United States the only major economy still to use magnetic-stripe card technology developed during the 1960s, when a better technology has long been available? Why not convert, as many countries did more than 12 years ago, to smart credit cards loaded with a digital chip? In Europe, the use of such technology has dramatically reduced the risk of this type of cyber-theft.

Unknown unknowns

The assignment of subjective probabilities is only possible when one deals with what is known, but how does one analyze the unknown-unknown events that financial analysts have popularized as “black swans,” also known as “extreme outliers”? An unknown-unknown includes all the variables whose very existence we are unaware of, let alone how they might affect a process.

How does one assess risks for unknown-unknowns (black swans)? One simply can’t, and yet these are the very risk events that could wipe you out in a few minutes or hours. It is impossible to give examples of unknown-unknowns except perhaps by invoking the imaginary world of science fiction.

Unknown variables or events, those infamous black swans, could never be considered, let alone analyzed, during even the most rigorous risk analysis. So what is the point? Well, I suppose one can always estimate the risk of what you think you know and hope that the probability of the unknown ever occurring is very, very small, or at least that it will not occur during your shift.

This article was first published Jan. 12, 2014, in CERM RISK INSIGHTS.

About The Author

James Lamprecht

James Lamprecht is a management consultant, statistician, teacher, and Six Sigma Master Black Belt. He has consulted in Europe, Canada, and Latin America; audited more than 100 companies worldwide; and conducted hundreds of seminars and classes in applied industrial statistics, ISO 9001, and Six Sigma. He has authored 11 books, including Interpreting ISO 9001:2000 with Statistical Methodology (ASQ Quality Press, 2001), Applied Data Analysis for Process Improvement: A Practical Guide to Six Sigma Black Belt Statistics (ASQ Quality Press, 2005), and Dare To Be Different: Reflections on Certain Business Practices (ASQ Quality Press, 2009). Lamprecht received his doctorate from UCLA.