
Donald J. Wheeler

Quality Insider

The Secret of Process Adjustment

Unneeded adjustments always make things worse

Published: Tuesday, January 29, 2013 - 13:30

When your process outcomes are not what you expect them to be, it is common to adjust the process. This is not always appropriate. To understand when adjustments are appropriate, and when they are inappropriate, we will need to learn how to distinguish between the noise contained within the data and a signal that an adjustment is needed. Both the problem and the solution will be illustrated by an example from my own experience.

Plant A

Plant A was operated using the values delivered by the in-house lab. The lab director wanted to deliver good values, so he faithfully followed the recommendation from the manufacturer of one of his key analytical instruments and recalibrated that instrument every day. This recalibration used a known industrial standard.

If the value the lab director obtained did not match the accepted value for the standard, he would adjust the instrument by an amount equal to the difference between the accepted value and the observed value. Since he ended up making an adjustment more than 80 percent of the time, he was convinced that this recalibration was both necessary and correct. Figure 1 shows the observed values for 100 consecutive tests of the known standard for Plant A.


Figure 1: XmR chart for 100 measurements of a known standard at Plant A

This XmR chart shows a reasonably consistent and predictable process. During this 100-day period, the process was adjusted 82 times and was manually recalibrated six times. Because the accepted value for the known standard was 0.310, the lab director felt quite content that his lab was delivering unbiased, high-quality measurements to the plant.

Just how good are his observations? Because the chart in figure 1 shows repeated measurements of the same standard, the moving ranges may be used to characterize the uncertainties in these measurements. The average moving range is 0.00296 units, so when we divide by the appropriate d2 bias correction factor of 1.128, we obtain an estimate of the standard deviation of the measurement system:

Estimated SD = 0.00296 / 1.128 = 0.00262 units

Probable Error = 0.675 × 0.00262 = 0.0018 units

This latter value means that a single measurement will err by 0.0018 units or more at least half the time. Although these measurements are being recorded to 0.001 units, they are essentially good to the nearest 0.002 units, which means they are almost, but not quite, as good as they look.
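The computation above is simple enough to sketch in a few lines of code. The readings below are illustrative values for a standard near 0.310, not the actual Plant A data:

```python
# Sketch: estimate measurement uncertainty from repeated tests of a
# known standard. The d2 bias-correction factor for moving ranges of
# two successive values is 1.128, and half of all measurements err
# by more than 0.675 standard deviations (the "probable error").
def probable_error(readings):
    # Average moving range: mean absolute difference of successive values
    moving_ranges = [abs(b - a) for a, b in zip(readings, readings[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    sigma_hat = avg_mr / 1.128
    return 0.675 * sigma_hat

readings = [0.310, 0.312, 0.309, 0.311, 0.308, 0.310, 0.313, 0.309]
print(probable_error(readings))
```

A reading recorded to 0.001 units but carrying a probable error near 0.002 units is, as the text notes, not quite as good as it looks.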

Plant B

Plant B is a competitor of Plant A. The in-house lab at Plant B uses an analytical instrument of the same make and model as the one used at Plant A, and, like Plant A, it tests the same industrial standard on a daily basis. However, this is where the similarities end. Rather than adjust the instrument based upon each reading, the lab director at Plant B placed each observed value for the standard on a consistency chart.

A consistency chart for a known standard is an XmR chart where the central line for X is set at the accepted value for the known standard. As long as this chart shows no evidence that the analytical instrument is off-target, no adjustments are needed, and so none are made. On those rare occasions when an observed value fell outside the limits on this X chart, an appropriate adjustment was made.
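As a sketch, the limits for such a consistency chart follow the usual XmR construction: the central line is pinned at the accepted value, and the limits sit 2.66 average moving ranges on either side of it. The data and accepted value here are illustrative:

```python
def consistency_limits(readings, accepted_value):
    # X-chart limits for a consistency chart: the central line is fixed
    # at the accepted value of the standard, and the limits are the
    # central line plus or minus 2.66 times the average moving range.
    moving_ranges = [abs(b - a) for a, b in zip(readings, readings[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    lower = accepted_value - 2.66 * avg_mr
    upper = accepted_value + 2.66 * avg_mr
    # Readings outside the limits are evidence of bias or a change
    signals = [x for x in readings if x < lower or x > upper]
    return lower, upper, signals

lower, upper, signals = consistency_limits(
    [0.310, 0.311, 0.309, 0.310, 0.312], accepted_value=0.310)
print(lower, upper, signals)
```

Only when a value lands in the `signals` list is an adjustment warranted; an empty list means leave the instrument alone.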


Figure 2: XmR chart for 100 measurements of a known standard at Plant B

Figure 2 shows 100 successive tests of the known standard at Plant B. No adjustments were required during this period. The average of 0.3101 units is essentially the same as the accepted value of 0.3100. The average moving range of 0.00200 units results in an estimate of the standard deviation of the measurement system of:

Estimated SD = 0.00200 / 1.128 = 0.00177 units

Probable Error = 0.675 × 0.00177 = 0.0012 units
This probable error value means that the observed values were essentially good to the nearest 0.001 unit. Since they were recorded to the nearest 0.001 unit, they were as good as they looked. By filtering out the noise in the data rather than treating each value as a signal, the lab director at Plant B reduced the workload in his lab by avoiding all the adjustments and recalibrations. At the same time, he delivered higher-quality data using the same instrument.

But the real pay-off for the consistency chart was not in the lab. In the words of the lab director for Plant B, “You don’t know how much grief you’ve saved us. When we started using the consistency chart for the standard, suddenly the whole plant started running better!”

The difference

The adjustment methodology used by Plant A is known as a proportional controller, or P-controller, with no deadband. It makes an implicit assumption that the data contain no noise. This is equivalent to assuming that two numbers that are not the same are different. Unfortunately, this is rarely true. In this world, numbers that are not the same are not necessarily different. As a result of interpreting each value as a signal, Plant A made many needless adjustments. These needless adjustments merely added to the variation in the measurements and degraded the quality of the reported values. This can be seen in figure 3, which shows the charts from figures 1 and 2 side by side.


Figure 3: Charts for Plant A and Plant B

In addition, the almost daily changes in the measurement process represented by Plant A's adjustments created chaos in production. Following each adjustment made in the lab, who knows how many adjustments were made in production. When they stopped using a rubber ruler in Plant B, the production department found that everything went more smoothly. Variation always creates costs. Excess variation creates unnecessary costs.
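The cost of a deadband-free P-controller can be seen in a short simulation. If each reading is the target plus measurement noise, and the instrument is adjusted by the full deviation after every reading, each reported value becomes the difference of two noise terms, so the variation grows by a factor of about the square root of 2. The target, sigma, and sample size below are illustrative:

```python
import random
import statistics

random.seed(1)
TARGET, SIGMA, N = 0.310, 0.002, 10_000

# Plant B style: no adjustment; each reading is target plus noise
untouched = [TARGET + random.gauss(0, SIGMA) for _ in range(N)]

# Plant A style: a P-controller with no deadband adjusts the
# instrument by the full deviation of each reading from the target
adjusted, offset = [], 0.0
for _ in range(N):
    x = TARGET + offset + random.gauss(0, SIGMA)
    adjusted.append(x)
    offset -= (x - TARGET)   # chase the last reading

print(statistics.stdev(untouched))  # close to SIGMA
print(statistics.stdev(adjusted))   # roughly SIGMA * sqrt(2), about 41% larger
```

This is the same lesson as Deming's funnel experiment: reacting to every deviation as if it were a signal inflates, rather than reduces, the variation.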

As seen by the example of Plant B, this degradation in the quality of the measurements was avoidable. The secret of how to avoid unnecessary adjustments is the central secret of all data analysis. Since all data contain noise, it is imperative that you learn how to filter out the probable noise before you react as if you have found a potential signal. And the simplest, most robust noise filter is a process behavior chart. Any attempt to start interpreting your values before you have filtered out the noise will result in your being misled by the noise.

But why didn’t Plant A’s chart show the adjustments as signals? As shown in my January 2013 column, “But the Limits Are Too Wide!” when the signals become ubiquitous, they look like routine variation to the computations and end up inflating the limits.

As I showed in my March 2011 column, “Three Questions for Success,” a process behavior chart is an operational definition of how to get the most out of your process. It defines what a process is capable of doing when it is operated up to its full potential. It provides a way of moving a process toward its full potential, and it provides a way of judging how close to full potential your process is currently operating.

When applied to repeated measurements of a standard, an XmR chart is known as a consistency chart because it can be used to assess the consistency of a measurement procedure. With a known standard, the central line for the X chart is placed at the accepted value for the known standard, and any signals are taken as evidence of a bias in the measurements or a change in the measurement process. In the case of an unknown standard, or with a biased measurement system, you may use the average value of the observed values as the central line for the X chart. Here signals will indicate a change in the measurement system.

Upon detecting a change, it is appropriate to take action. Actions taken in the absence of such evidence are likely to be needless adjustments, and as we have seen, needless adjustments are not benign. They inevitably increase costs, waste effort, and create grief.


About The Author

Donald J. Wheeler

Dr. Donald J. Wheeler is a Fellow of both the American Statistical Association and the American Society for Quality, and is the recipient of the 2010 Deming Medal. As the author of 25 books and hundreds of articles, he is one of the leading authorities on statistical process control and applied data analysis. Find out more about Dr. Wheeler’s books at www.spcpress.com.

Dr. Wheeler welcomes your questions. You can contact him at djwheeler@spcpress.com

Comments

simple question

I am not certain where 0.675 is coming from in the probable error calculation for one point.  Can someone address this?  Thanks!

0.675

It's in a number of Dr. Wheeler's books. The simplest explanation is that it's the middle 50% of the normal distribution. Errors will exceed 0.675*s (or "sigma hat") about half the time; in other words, will fall within xbar plus or minus 0.675 sigma hat about half the time. More detailed explanations, and the rationale for using Probable Error, may be found in Understanding Statistical Process Control (3rd Ed.), EMP III Using Imperfect Data, or Evaluating the Measurement Process.

Plant B always wins

Another excellent article from Dr. Wheeler.  Our company experiences almost the same situation.  We have an analytical instrument that measures a property of our product that is critical to quality.  Many years ago the instrument was re-calibrated on a time-based frequency - whether it needed it or not.  We adopted the Plant B approach and ran weekly calibration checks with a known standard and plotted the results on an XmR chart.  We found that the instrument retains its calibration for much longer than anticipated.  When a calibration check resulted in a signal, we first inspected the instrument and standard for special causes.  In many cases, the instrument was simply contaminated or the standard was damaged.  Correcting these problems and rerunning a calibration check often showed that the instrument was indeed still in calibration.

Our customers noticed the improved consistency in our product which helped us increase market share.

Institutionalized Rule 2

Great article, great example. This practice is just unabashed, institutionalized Rule 2 of the Funnel. I agree that it can be difficult for many organizations to accept, because it's "just common sense" that if one number is different from another number, something must have caused it to be different. That kind of mechanistic, deterministic mindset drives a lot of bad behavior.

I used to enjoy teaching this concept to Marines, when I was teaching SPC in the Department of the Navy. Marines just "got it," almost immediately, because they knew that you don't adjust your sights based on the last shot fired, you adjust based on the group. Once you get the group centered on target, you leave it alone.

As Seen In Action

Your column described a phenomenon I observed (with shock) at one of my previous employers. They used a pH-conductivity meter to make measurements. Every day, before the first production measurement, they would calibrate the meter with a known standard. Because the manufacturer recommended this procedure, and because it had been done for as long as anyone could remember, my efforts to educate them on the wrongness of their approach fell flat.

Lesson 1: Habits are difficult to overcome.

Lesson 2: Manufacturer's recommendations aren't always right.

Even though I no longer work there, I have no doubt that the practice of calibrating the meter every day still goes on.

(Shrikant Kalegaonkar, twitter: @shrikale, LinkedIn: http://www.linkedin.com/in/shrikale/)