Davis Balestracci

Quality Insider

The Final Common Cause Strategy

‘Statistical control’ (common cause only) is a major achievement

Published: Friday, December 14, 2012 - 15:58

Previously I discussed three common cause strategies (links below) that help to expose all existing, underlying special causes of variation. They also provide necessary insight into how the current process came to be and allow construction of a baseline for assessing the effects of an intervention.

Common cause strategy No. 1: Exhaust in-house data, in “Wasting Time With Vague Solutions, Part 2”
Common cause strategy No. 2: Study the current process, in “Wasting Time With Vague Solutions, Part 3”
Common cause strategy No. 3: Process dissection, in “Another Strategy for Determining Common Cause”

W. Edwards Deming considered this a major achievement—getting a process to the point where only common cause is present. He felt that it was only after all lurking special causes were exposed and appropriately dealt with that improvement could begin.

Many Six Sigma efforts make the seductive mistake of taking cost savings obtained in this manner and projecting them as an ongoing expected rate of return. To paraphrase something Deming once said about this, “All you’ve done is get your process to where it should have been in the first place. That is not improvement!” It is a one-time savings, and that rate of improvement will not necessarily continue.

Using the first three strategies will help to optimize the current process to the full extent of its capability. If the process remains incapable of meeting customer needs, and some deeper, tolerated systemic root causes still have not been addressed, the next strategy becomes appropriate: evaluating a fundamental redesign of the process to achieve the desired outcomes.

Common cause strategy No. 4: Designed experimentation

The final common cause strategy is designed experimentation. This involves making a major, fundamental structural change in the way a process is performed, then measuring whether it is indeed an improvement. It also involves even more disturbance to the daily work culture than process dissection.

Designed experimentation should be used only after careful consideration and good planning. It is very common and tempting for teams to jump immediately to this step. The project becomes, in essence, implementing a “known” solution based on anecdotes or a recent (sanitized) journal article, or someone’s (sanitized) conference presentation. This is symptomatic of a lack of a clear problem definition. It’s a “vague solution to a vague problem,” which runs the danger of further compounding this error due to:
• Poor understanding of how your process really works
• Lack of good baseline data
• The presence of unexposed special causes
• Reliance on poorly planned data collection

 

These have serious ramifications for the subsequent evaluation and for any prediction made from its results: It will be no better than looking into a crystal ball.

If used initially, this approach naively begins the challenging problem-solving journey with the most “aggressive” strategy. Implicit in a designed experiment is major disturbance to a work culture that thrives on predictability. Especially if it is unannounced and perceived as being forced on a work culture, it will also inhibit cooperation and understanding from the department(s) involved in the experiment.

Think again about your credibility. To summarize the common-cause strategy journey, flowcharts and data collections allow people to focus on the significant improvement opportunities. They also ensure the right solution is applied to the right process. The team will gain substantial political credibility if it:
• Considers the work culture’s feelings
• Respects the use of people’s time during the project or experiment
• Demonstrates its competency in the improvement process
• Involves members of the work culture at the appropriate times

 

Despite the best of designs, there are the logistics of actually conducting the study—with its inherent danger of lurking human variation. Candidates for solutions should be generated, prioritized, and initially tested on a small scale using the good data principles described in my columns, “Four Data Processes, Eight Questions, Part 1” and “Four Data Processes, Eight Questions, Part 2.”

One should always pilot any proposed solution on a focused, small scale first. Putting a large work environment through a major disturbance creates unintended variation and additional confusion, and clouds the results. A pilot will expose the lurking, unexpected problems that compromise the experimentation process itself—as well as lurking cultural “sabotage.”

The question, “How will we know if things are better?” must be answered in the planning of the experiment and then tested with appropriate data collection (which might even initially include a test of the data collection process).

So…“How will we know if things are better?” (statistically)

If you have proceeded to this stage via strategies one through three, you should have some baseline data—hopefully a run chart—describing the problem. If there is a good baseline, the run chart can be continued and the standard statistical rules can be used to determine whether the desired, favorable special cause was created.

In the case of a study designed specifically to move the baseline in a desired direction, the standard “eight-in-a-row” rule (eight consecutive points all above or all below the median) can be relaxed to “six-in-a-row.” The trend rule of “six successive increases or decreases” can likewise be relaxed to “four.”
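As a minimal sketch of applying the relaxed “six-in-a-row” rule, the check below tests post-change points against the baseline median. The function name, the convention that points exactly on the median break a run, and the sample numbers are my own illustrative assumptions, not from the column:

```python
# Hypothetical sketch: check a run chart for a sustained shift after a
# planned intervention, using the relaxed "six-in-a-row" rule.
from statistics import median

def longest_run_vs_median(values, baseline_median):
    """Longest run of consecutive points strictly above or below the median.

    Points exactly on the median break the run (a common run-chart convention).
    """
    longest = current = 0
    last_side = 0  # +1 above, -1 below, 0 on the median
    for v in values:
        side = (v > baseline_median) - (v < baseline_median)
        if side != 0 and side == last_side:
            current += 1
        else:
            current = 1 if side != 0 else 0
        last_side = side
        longest = max(longest, current)
    return longest

# Baseline data establishes the median; post-change points are tested against it.
baseline = [12, 14, 11, 13, 15, 12, 14, 13]
post_change = [10, 9, 11, 9, 8, 10]            # all below the baseline median
m = median(baseline)                            # 13.0
shift_detected = longest_run_vs_median(post_change, m) >= 6  # relaxed rule
```

With standard (non-study) data, the same function would simply be compared against 8 instead of 6.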

Using a control chart, the “two-out-of-three consecutive data points between two and three standard deviations” rule is often very useful, as is the “four-out-of-five consecutive data points between one and three standard deviations” rule. These are in addition to the more obvious signals of a data point going outside the process common-cause limits, or a significant moving range appearing immediately on implementation.
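The two zone rules above share one shape: k of n consecutive points beyond z sigma, on the same side of the center line. A minimal sketch, assuming `mean` and `sigma` are the common-cause center line and standard deviation already estimated from the baseline (e.g., via the average moving range); `zone_rule` and the sample data are illustrative inventions, not from the column:

```python
# Hypothetical generic zone-rule check for an individuals chart.
def zone_rule(values, mean, sigma, window, count, z):
    """True if any `window` consecutive points contain `count` or more
    points beyond z*sigma from the mean, all on the same side."""
    for i in range(len(values) - window + 1):
        win = values[i:i + window]
        above = sum(1 for v in win if v > mean + z * sigma)
        below = sum(1 for v in win if v < mean - z * sigma)
        if above >= count or below >= count:
            return True
    return False

mean, sigma = 50.0, 2.0
data = [50, 51, 55, 49, 55.5, 50]   # illustrative post-implementation points

rule_2_of_3 = zone_rule(data, mean, sigma, window=3, count=2, z=2)
rule_4_of_5 = zone_rule(data, mean, sigma, window=5, count=4, z=1)
beyond_limits = any(abs(v - mean) > 3 * sigma for v in data)
```

The same-side requirement matters: two points beyond two sigma on opposite sides of the center line do not signal a shift in one direction.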

I hope this journey into common cause strategies has made you aware of:
• The need to be wary of “vague solutions to vague problems”
• The “human variation” and important cultural factors in improvement logistics
• The need for a good baseline estimate (and clear definition!) of the problem


About The Author


Davis Balestracci

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.