How Exposure Affects Risk
Embracing ‘can’t rather than don’t’
William A. Levinson
Published: Monday, August 24, 2015 - 11:26

The requirement for risk-based thinking is among the most significant changes in ISO 9001:2015. Army Techniques Publication (ATP) 5-19, Risk Management, is a public-domain reference that supports this requirement. ATP 5-19 includes a risk assessment approach that is similar to failure mode and effects analysis (FMEA), but is considerably simpler. As seen in the table in figure 1, it uses only four severity and five occurrence (probability) ratings, and generates one of four hazard levels as the counterpart to the FMEA’s risk priority number (RPN).

Unlike traditional FMEA, ATP 5-19 introduces the issue of exposure, which it describes as “the frequency and length of time personnel and equipment are subjected to a hazard or hazards. For example, given about 500 exposures, without proper controls, a harmful event will occur. Increased exposure—during a certain activity or over iterations of the activity—increases risk. An example of frequent occurrence is a heat injury during a battalion physical training run, with a category 5 heat index and nonacclimated soldiers.” Those familiar with FMEA will recognize that this corresponds to an FMEA occurrence rating of 10, which means “failure is almost inevitable.”

Exposure, therefore, takes into account not only the individual chance of occurrence, which is the basis for an FMEA occurrence rating, but also the number of opportunities for the problem to happen. The result is simply the familiar np of the binomial distribution, in which p is the chance of occurrence and n is the number of opportunities. A smaller p is always better, but if n is sufficiently large, the occurrence of the problem can still be likely or even a near-certainty.
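The arithmetic behind exposure is easy to sketch. The following is a minimal illustration, assuming a binomial model as described above; the function names are my own, not terminology from ATP 5-19:

```python
# Exposure-adjusted occurrence under a binomial model:
# p = individual chance of occurrence per exposure
# n = number of exposures (opportunities for the problem to happen)

def expected_events(p: float, n: int) -> float:
    """Expected number of occurrences: the familiar np of the binomial distribution."""
    return n * p

def prob_at_least_one(p: float, n: int) -> float:
    """Probability that the event occurs at least once in n exposures."""
    return 1.0 - (1.0 - p) ** n

# ATP 5-19's example: "given about 500 exposures ... a harmful event will occur."
p, n = 1 / 500, 500
print(expected_events(p, n))    # expected occurrences: np = 1.0
print(prob_at_least_one(p, n))  # roughly 0.63, despite the small per-exposure chance
```

A small p looks reassuring in isolation; multiplying by n is what reveals the near-certainty.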
In his article “Properties of Probability Models, Part 1,” which appeared in Quality Digest Daily on August 3, 2015, Donald Wheeler provides practical applications with which statistical practitioners are already familiar: “This fine-tuning is important because additional data are not generally going to be available and they need to get the most out of the limited amount of experimental data. Thus, the complexity and cost of most experiments will justify a fair amount of complexity in the analysis. Moreover, to avoid missing real signals within the experimental data, it is traditional to filter out only 95 percent of the probable noise.” The Type I, or alpha, risk of wrongly rejecting the null hypothesis is relatively high (5 percent), but we are exposed to this risk only once. A 5-percent false-alarm risk would, on the other hand, be totally unacceptable for the statistical process control charts on which we plot dozens or hundreds of sample statistics.

Gorur Sridhar, in his article “Do You Want Six Sigma or Quality?” which appeared in Quality Digest Daily on August 6, 2015, provides another perfect illustration of this concept: “Now imagine that you are admitted for a surgical procedure on your right eye. After the surgery, you wake up to find out that the wrong eye was operated on. Are you consoled by the hospital superintendent saying that mistakes do happen but the percentage is just way too small to be quantified?”

Wrong-site surgery is a menace to human safety, and therefore justifies a 10 severity rating in an FMEA. Also suppose that the detection rating is 3 and the occurrence rating is 2 (1 in 150,000). The resulting RPN is 60, which is not particularly bad, although a 10 severity always demands attention. Here, however, is how the Army’s risk management process would handle it. The severity is “Catastrophic,” which corresponds to FMEA severities of 9 and 10.
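The traditional RPN calculation in Sridhar's example can be sketched as follows; this is an illustrative snippet of the standard severity × occurrence × detection product, not code from any FMEA tool:

```python
# Traditional FMEA risk priority number (RPN), each rating on a 1-10 scale.

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """RPN = severity x occurrence x detection."""
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("FMEA ratings run from 1 to 10")
    return severity * occurrence * detection

# Wrong-site surgery: severity 10, occurrence 2 (1 in 150,000), detection 3.
print(rpn(10, 2, 3))  # 60: unremarkable, despite the severity-10 hazard
```

The low product is exactly the weakness the article is pointing at: a catastrophic hazard can hide behind a modest RPN when its individual occurrence rating is low.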
In addition, even though in Sridhar’s example “the percentage is just way too small to be quantified”—in other words, a low individual probability of occurrence—the fact that millions of surgical procedures are performed annually makes it a near-certainty that wrong-site surgeries will be performed. ATP 5-19 would therefore assign an occurrence rating of “Frequent,” the worst possible rating, despite the low individual chance of occurrence. The combination of Catastrophic severity with Frequent occurrence results in an “Extremely High” risk level.

Returning to the table in figure 1, we see the difference between using only the individual chance of occurrence (“Seldom” or even “Unlikely”) and the exposure (n) in combination with the individual chance of occurrence (p). The first approach yields a “High” or “Medium” risk, noting that no hazard with a Catastrophic rating can have less than a Medium risk. The second approach tells us that the risk is Extremely High, and therefore demands urgent preventive action.

This, in turn, is where Henry Ford’s “Can’t rather than don’t” principle becomes important. In his article, Sridhar adds: “In the hospital in this scenario, what is that one factor that differentiates between an error and a successful outcome? Perhaps it’s that extra caution and precaution that every individual has to exercise in the course of discharging his or her duties.” However, as seen in the 1920 article “How Henry Ford saves men and money,” which appeared in the National Safety News journal, Ford realized that any task that relies on worker vigilance to avoid an accident or defect will eventually produce one. Workers were presumably warned repeatedly, “Don’t let your fingers get into the path of sewing machine needles,” but three or four minor injuries nonetheless occurred daily.
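The ATP 5-19 style lookup can be sketched as a simple table. Only the Catastrophic row appears below, and only the entries for Frequent, Seldom, and Unlikely are grounded in the article's discussion; the others are marked as assumptions, and the actual matrix in figure 1 of ATP 5-19 is authoritative:

```python
# Illustrative sketch of an ATP 5-19 style hazard-level lookup (see figure 1).
# Catastrophic row only; entries marked "assumption" are not confirmed by the article.

HAZARD_MATRIX = {
    ("Catastrophic", "Frequent"):   "Extremely High",
    ("Catastrophic", "Likely"):     "Extremely High",  # assumption
    ("Catastrophic", "Occasional"): "High",            # assumption
    ("Catastrophic", "Seldom"):     "High",
    ("Catastrophic", "Unlikely"):   "Medium",  # Catastrophic never falls below Medium
}

def hazard_level(severity: str, probability: str) -> str:
    """Look up the hazard level for a severity/probability pair."""
    return HAZARD_MATRIX[(severity, probability)]

# Individual chance of occurrence alone understates the wrong-site surgery risk;
# exposure across millions of procedures pushes the occurrence rating to Frequent.
print(hazard_level("Catastrophic", "Unlikely"))  # Medium
print(hazard_level("Catastrophic", "Frequent"))  # Extremely High
```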
The injury rate dropped to zero when guards were installed, as seen in figure 2, so “You can’t put your finger into the path of the needle.” ATP 5-19 similarly cites the difference between administrative controls, which require people to be “careful,” and engineering controls that make problems impossible: “The preferred method is to control the hazard at its source, through engineering. Engineering is preferable because, unlike other controls, it generally focuses on the individual who is exposed. The concept behind engineering controls is that, to the extent feasible, engineers or Army units design the equipment or work environment and the task to eliminate hazards or to reduce exposure.”

Engineering controls therefore support “Can’t rather than don’t,” and are forms of error-proofing. When routine performance of a job generates thousands or millions of opportunities for error, “Can’t rather than don’t” is the only way to ensure that trouble will never occur.

William A. Levinson, P.E., FASQ, CQE, CMQOE, is the principal of Levinson Productivity Systems P.C. and the author of the book The Expanded and Annotated My Life and Work: Henry Ford’s Universal Code for World-Class Success (Productivity Press, 2013).
Figure 1: ATP 5-19 risk assessment matrix
Figure 2: Can’t rather than don’t
© 2022 Quality Digest. Copyright on content held by Quality Digest or by individual authors. Contact Quality Digest for reprint information.
“Quality Digest" is a trademark owned by Quality Circle Institute, Inc.