John Flaig
Published: Monday, February 24, 2014 - 15:02

Engineers have used safety margins for centuries to protect their companies and customers from the consequences of product degradation and failure. Sometimes the safety margins are fairly obvious (e.g., maximum-load limits posted in elevators), and other times they’re not.
Design margins are often hidden from view. For example, some computers have cooling fans that are used to reduce the CPU’s failure rate: a heat sensor automatically increases the fan’s speed, which increases airflow and cools the CPU, thus reducing the failure rate of this expensive component.

In quality engineering there are some popular statistics that include less-than-obvious safety margins. Consider the process capability index Cpk, which is defined for a stable normal process as:

Cpk = min{(USL – mean), (mean – LSL)} / (3 sigma)

where sigma is the within-rational-subgroup estimate of variation. Because the minimum is used, the metric reflects only half the distribution, and hence the fraction nonconforming for the distribution can range from p to 2p, where p is the normal tail area corresponding to the smaller distance from the mean to a specification limit. And because p is the worst case for half of the distribution, Cpk has a hidden safety margin in terms of the possible fraction nonconforming, which ranges from zero to p. The fraction nonconforming is considered by many to be an empirical measure of process incapability; therefore, Cpk is a nonspecific but conservative measure of process capability.

Another example of a hidden safety margin in quality engineering is the assumed ±1.5 sigma shift that is part of the Six Sigma methodology. Of course, it’s not called a safety margin, but a rose by any other name is still a rose. According to historical anecdotes, engineers at Motorola observed that their process means tended to drift around by about ±1.5 sigma. They wanted to build in a safety margin to compensate for this variation in case other processes exhibited similar instability. At least this is the story that we are left with today. However, I think the real story might be a little different.
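As a rough sketch of the Cpk arithmetic above (the function names are mine, and the within-subgroup sigma is estimated here from the average moving range of individual values, MR-bar/d2 with d2 = 1.128 for subgroups of size n = 2 — one common choice, not necessarily the article's):

```python
from statistics import NormalDist, fmean

D2 = 1.128  # control-chart constant d2 for subgroups of size n = 2

def cpk(data, lsl, usl):
    """Cpk for a stable, approximately normal process.

    Sigma is the within-subgroup estimate, taken here from the
    average moving range of individual values (MR-bar / d2).
    """
    mean = fmean(data)
    mr_bar = fmean(abs(a - b) for a, b in zip(data, data[1:]))
    sigma = mr_bar / D2
    return min(usl - mean, mean - lsl) / (3 * sigma)

def nonconforming_bounds(cpk_value):
    """Worst-case tail area p for the nearer spec limit; the true
    fraction nonconforming lies between p and 2p."""
    p = NormalDist().cdf(-3 * cpk_value)
    return p, 2 * p

data = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0]
c = cpk(data, lsl=9.0, usl=11.0)   # about 1.57 for this toy data
p_lo, p_hi = nonconforming_bounds(c)
```

Because only the nearer specification limit enters the minimum, the reported p understates the total fraction nonconforming by at most a factor of two — the hidden margin the article describes.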
Let’s assume we have a stable process (i.e., one that is not drifting around by ±1.5 sigma) given by X = {x1, x2, x3, …, xN}, and further assume that the xi are approximately normally distributed. The xi ~ N(µ, σ) can be mapped into N(0, 1) using the linear transform zi = (xi – µ)/σ, which results in Z ~ N(0, 1). Forming moving subgroups of Z of size n = 2, one can estimate the sigma of the subgroup mean (m) and, using the constants of the standard normal distribution, construct the α = 0.1 two-sided 90-percent confidence interval for the mean µ. Because this interval is on the Z scale, where σ = 1, converting it back to the original X distribution expresses the margin in units of the process sigma. I think this may be how the Motorola people arrived at the ±1.5 sigma shift.

Now, I agree that it’s good engineering practice to specify a safety margin to mitigate risk, but I really think the type-1 error choice should be based on an analysis of the ramifications of failure rather than set at an arbitrary ±1.5 sigma. This decision is just like buying an insurance policy: You may not be paying enough, or you could be paying too much, to protect yourself against your risk exposure.

John J. Flaig, Ph.D., is a fellow of the American Society for Quality and is managing director of Applied Technology at www.e-at-usa.com, a training and consulting company.
Flaig has given lectures and seminars in Europe, Asia, and throughout the United States. His special interests are in statistical process control, process capability analysis, supplier management, design of experiments, and process optimization. He was formerly a member of the editorial board of Quality Engineering, a journal of the ASQ, and associate editor of Quality Technology and Quantitative Management, a journal of the International Chinese Association of Quantitative Management.

Is the 1.5 Sigma Shift an ‘Ill-Conceived Safety Margin?’
The hidden use of safety margins in quality engineering
© 2023 Quality Digest. Copyright on content held by Quality Digest or by individual authors. Contact Quality Digest for reprint information.
“Quality Digest" is a trademark owned by Quality Circle Institute, Inc.
Comments
Ill-used, anyway...
I've been doing some research for a paper on this subject. Mikel Harry originally conceived the 1.5-sigma shift as a design margin. He told his engineers to run their simulations with all the critical component metrics shifted 1.5 sigma in the worst-case direction, to simulate the potential effects on the system of a worst-case tolerance stack nightmare. This makes a lot of sense from a robust-design perspective.
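The worst-case exercise described above can be sketched analytically (this is my own illustration with made-up component sigmas, not from the comment): for a linear stack Y = sum of the Xi, shifting every component mean by 1.5 sigma in the same direction moves the stack mean by 1.5 times the sum of the sigmas, while the stack sigma grows only as the root-sum-square — so the shifted stack sits much closer to its limit than the nominal design suggests.

```python
from math import erfc, sqrt

# Hypothetical component sigmas for a linear tolerance stack (same units).
sigmas = [0.1, 0.2, 0.15, 0.05]

stack_sigma = sqrt(sum(s * s for s in sigmas))  # RSS sigma of the stack
worst_shift = 1.5 * sum(sigmas)                 # every mean pushed 1.5 sigma one way

def upper_tail(z):
    """P(Z > z) for a standard normal."""
    return 0.5 * erfc(z / sqrt(2))

# Put the spec limit 6 stack-sigmas above nominal, then compare tails.
usl_distance = 6 * stack_sigma
p_nominal = upper_tail(6)                                      # centered stack
p_shifted = upper_tail((usl_distance - worst_shift) / stack_sigma)
```

With these numbers the shifted stack's tail probability is orders of magnitude larger than the centered one, which is exactly why running simulations at the worst-case shift is a sensible robustness check.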
What doesn't make sense is that, for whatever reason, this idea ended up being extrapolated into an assumption that a process could somehow sustain an undetected 1.5-sigma shift indefinitely (or at least over a statistically stable production run of 1,000,000). That absurd claim is probably the worst of several fundamentally flawed assumptions behind the "Process Sigma Table" in most Six Sigma training materials, and behind its claim that a process operating at "Six Sigma" levels of quality produces no more than 3.4 defects per million opportunities.
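The 3.4-DPMO figure the comment refers to can be checked directly (a quick sketch using only standard normal arithmetic): "Six Sigma" quality with an assumed 1.5-sigma shift leaves 6 − 1.5 = 4.5 sigma to the nearer specification limit.

```python
from math import erfc, sqrt

def upper_tail(z):
    """P(Z > z) for a standard normal."""
    return 0.5 * erfc(z / sqrt(2))

dpmo_shifted = upper_tail(6 - 1.5) * 1e6   # one-sided tail at 4.5 sigma, ~3.4
dpmo_centered = 2 * upper_tail(6) * 1e6    # both tails, no shift, ~0.002

print(f"with 1.5-sigma shift: {dpmo_shifted:.1f} DPMO")
print(f"centered process:     {dpmo_centered:.4f} DPMO")
```

The gap between the two numbers, roughly three orders of magnitude, is entirely an artifact of the assumed sustained shift, which is the commenter's point.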