Paul Naysmith

Quality Insider

Seven Quality Tools of an Improvement Ninja, Part 3

The control chart

Published: Monday, March 25, 2013 - 13:12

Story update 4/3/2013: The author replaced his earlier chart example with an explanation of how to set up an Xbar chart.

Unlike the difficult "third album," the one that is supposed to be a real challenge following the first two musical productions, my third album in the Seven Quality Tools suite is quite easy to compose. Why? Because I'm fortunate in having access to the vast swathes of research that many statisticians have contributed about the control chart, which is the subject of this column.

If you haven't read part one or part two of my Improvement Ninja saga, they aren't necessary for understanding the drift of part three. Herein I continue my mission by covering the third tool on the list of seven quality tools, as recognized by the American Society for Quality (ASQ). I will warn most statisticians up front that I will hideously oversimplify this tool. Keep in mind this series is aimed at newly initiated quality professionals, or anyone who is interested in the subject but hasn't necessarily been through the Rosetta Stone Quality Language training system.

For those not in the know, the seven basic quality tools are as follows:
1. Cause and effect diagram
2. Check sheet
3. Control charts
4. Histogram
5. Pareto chart
6. Scatter diagram
7. Stratification

Tool 3: The control chart

The control chart has its first recorded origins in the distant past of the 1920s and 1930s, when American hero Walter A. Shewhart, making telephones at the time, demonstrated to his employers that reducing variation would increase quality. One of the tools he used to achieve this was the control chart. I often reflect that this breakthrough in industrial performance assessment was the evolutionary starting point for our very comfortable lives nearly 100 years later.

In some circles, Shewhart's contribution to society is recognized, albeit rarely appreciated. I would strongly advise that if you haven't yet researched Shewhart's work or remarkable life, you do so. You will see that over time, Shewhart has unquestionably shaped many of today's brilliant quality thinkers. W. Edwards Deming, in his preface to the republished edition of Shewhart's Statistical Method from the Viewpoint of Quality Control (Dover Publications, 1986), gushes about his mentor. "Another half a century may pass before the full spectrum of Dr. Shewhart's contributions will be revealed in liberal education, science, and industry," he says. In my own words, Shewhart was light years ahead in his thinking and application of quality control. I do hope, though, that I don't have to wait until 2036 to share the benefits.

In part two, on check sheets, we looked into how to collect data. However, those of us in business need to understand what's going on with data from our processes in real time, and recognize changes. In overly simplistic terms, a control chart can help us identify changes inside a process, highlight whether it is getting better or worse, and show whether it's stable and predictable enough that we can state, with some confidence, what to expect it to do tomorrow, the day after, and so on.

I have used control charts in my lean Six Sigma Black Belt projects, and continue to do so today. I have used them to measure variation and determine whether processes are "in control." I've also used them to demonstrate process improvement, verifying that my changes have worked—or possibly not. The benefit for me is that I get a good visual representation of what the process is doing, and that's what I'll focus on here.

Tool ratings

Difficulty to understand. I rate the control chart a 7 on a scale of 10, with 10 being most difficult. Because control charts are a bit heavy on the statistical side, I think most people would naturally shy away from the mechanics involved; however, using the chart and just interpreting it is a bit simpler.
Difficulty to use. I rate it a 5 out of 10, with 10 being the most difficult. Expressing data points in a chart is much more useful than declaring a single day's value. The control chart is useful for demonstrating data through time, interpreting what's going on in the chart, then taking necessary action.
Difficulty to create on a computer. I rate it a 3 out of 10, with 10 being the most difficult. As you will read below, you can create control charts with many different software options.

How to create one

Remember, this is an oversimplified description. Within each of these steps there are things you need to watch out for. The purpose here is to give you the gist of how a particular type of control chart is built. In this case, an Xbar chart. The Xbar chart is used in conjunction with an R chart, but for the sake of simplicity we will only look at how an Xbar chart is created and interpreted.

1. Gather measurements from your process. Below in figure 1 I have collected some data from a widget company. This company measures its widgets at the end of the production line about four times during each shift. The four measurements from each shift form a subgroup. Keep in mind that manpower, machines, materials, methods, measurements, and the environment need to be pretty much the same for each of the four samples that make up the subgroup.

The daily shift average is calculated from the subgroup measurements: add the four measurement results together, then divide by four. Then we take these data points (in time order) and plot them in a line graph. Please be mindful that in the world of statistics, the average is expressed as “Xbar” or X̄, and will not be called the average, but the “mean.”

Figure 1: Graph of subgroups with each point an average of four measurements.
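To make step 1 concrete, here is a minimal Python sketch. The widget measurements below are invented purely for illustration; they are not the author's data from figure 1.

```python
# Step 1 sketch: four hypothetical measurements per shift form one subgroup.
subgroups = [
    [10.2, 10.4, 9.9, 10.1],   # shift 1
    [10.0, 10.3, 10.2, 9.8],   # shift 2
    [10.5, 10.1, 10.0, 10.2],  # shift 3
]

# Each plotted point on the Xbar chart is the mean of one subgroup.
xbar = [sum(s) / len(s) for s in subgroups]
print(xbar)  # the points to plot, in time order
```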

2. Calculate the mean value of all the means in the previous step. This will allow you to plot a “center line” on the chart, which I’ll refer to later as the CL. This CL, the average of the averages, is expressed as “XbarBar” or X̿.

3. Calculate the grand range. To calculate a grand range, first determine the range of each of the four readings at each time point, which is just the maximum minus the minimum. The grand range is the average of all the ranges from each time point and is referred to as Rbar.

4. Select a control chart constant. The control chart constant comes from a table of mathematically derived values that involve a lot of statistical stuff you don't need to know at this point. All you need to know is how many samples were in each of your subgroups, which was four. You now need to refer to the table at the right for the constant that will be used in a subsequent step. Since our sample size is four, the value for A2 is 0.729.

5. Calculate limits. Now we have the data we need to calculate our limits. The center line we have already defined. It is X̿, or what I have called CL.

Remember that A2 in the equations below comes from the table at right. For a sample of 4, A2 is 0.729.

Our upper control limit or UCL = CL + A2 * Rbar
Our lower control limit or LCL = CL – A2 * Rbar
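Steps 2 through 5 can be sketched in a few lines of plain Python. The subgroup data below are hypothetical, and A2 = 0.729 is the standard tabled constant for subgroups of four.

```python
# Sketch of steps 2-5: center line, grand range, and control limits.
subgroups = [
    [10.2, 10.4, 9.9, 10.1],
    [10.0, 10.3, 10.2, 9.8],
    [10.5, 10.1, 10.0, 10.2],
]

xbar = [sum(s) / len(s) for s in subgroups]  # subgroup means (step 1)
cl = sum(xbar) / len(xbar)                   # XbarBar, the center line (step 2)
rbar = sum(max(s) - min(s) for s in subgroups) / len(subgroups)  # Rbar (step 3)
A2 = 0.729                                   # tabled constant for n = 4 (step 4)

ucl = cl + A2 * rbar                         # upper control limit (step 5)
lcl = cl - A2 * rbar                         # lower control limit
print(cl, ucl, lcl)
```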

6. Construct our plot. In the figure below CL is the green line in the middle. The UCL is the red line above CL and the LCL is the red line below CL.

7. Creating zones. Finally, we need to create three evenly spaced zones between the center line and the upper and lower control limits. Simple math, so I will let you figure that one out. Starting from the center line, these lines sit at 1, 2, and 3 standard deviations, or sigma, from it.
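The zone arithmetic can be sketched as follows; the center line and upper control limit values here are assumed for illustration. Each zone boundary is one-third of the distance from the center line to a control limit, i.e., one sigma.

```python
# Step 7 sketch: zone boundaries between the CL and the control limits.
# The CL and UCL values here are assumed, purely for illustration.
cl, ucl = 10.0, 10.9
sigma = (ucl - cl) / 3            # each zone is one sigma wide

upper_zones = [cl + k * sigma for k in (1, 2, 3)]  # 1, 2, 3 sigma above CL
lower_zones = [cl - k * sigma for k in (1, 2, 3)]  # mirrored below CL
print(upper_zones, lower_zones)
```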

8. What does it mean? Remember, the Xbar chart is used in conjunction with an R chart which we haven't covered. In essence, the R chart would indicate whether you should look at the Xbar chart. For the sake of simplicity, we are only going to consider that we have reviewed the R chart and will now look at the Xbar chart to determine if our process is acting strangely. There are many rules for determining when a process is showing unusual behavior, but here are a few common ones.

Rule 1: Any point falls beyond 3σ above or below the centerline
Rule 2: Two out of three consecutive points fall beyond 2σ on the same side of the centerline.
Rule 3: Four out of five consecutive points fall beyond 1σ on the same side of the centerline.
Rule 4: Nine or more consecutive points fall on the same side of the centerline.

There are more, but you get the idea. In a nutshell, the "pattern" of the dots tells you whether your process is stable or acting unusually.
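Two of the rules above are easy to sketch in Python. The function names, center line, and sigma values below are my own invention for illustration, not part of any standard library.

```python
# Hedged sketch of rules 1 and 4; points are subgroup means in time order.

def rule1(points, cl, sigma):
    """Rule 1: indices of points beyond 3 sigma from the center line."""
    return [i for i, p in enumerate(points) if abs(p - cl) > 3 * sigma]

def rule4(points, cl):
    """Rule 4: True if nine or more consecutive points fall on one side of the CL."""
    run, side = 0, 0
    for p in points:
        s = 1 if p > cl else (-1 if p < cl else 0)
        run = run + 1 if (s != 0 and s == side) else (1 if s != 0 else 0)
        side = s
        if run >= 9:
            return True
    return False

# Example: points 0 and 2 breach the 3-sigma limits.
print(rule1([10.5, 10.0, 9.6], cl=10.0, sigma=0.1))
```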

I've circled a few of the suspect points or groups of points in figure 2 below.

Figure 2: Xbar chart

Helpful software for using the tool

MS PowerPoint. There is a chart-builder tool inside PowerPoint, but you might have to make your calculations on paper first. Why not just use Excel or another statistical package, and then copy and paste across?

MS Word. The same issue as PowerPoint affects Word.

MS Excel. It may take a bit of jiggery-pokery from you to build a control chart in Excel. I'm aware that on the Internet you can find free, public-domain templates with built-in shortcuts to help you make a control chart, as well as step-by-step guides with gloriously flowing screen shots. However, if you don't have one of these templates and are starting from scratch, you may find yourself spending a lot of time using the "insert function" or creating your own calculations in the spreadsheet. You can do it, but you would need a serious amount of patience. (Note: I do sell "patience" for $200 per hour, minimum order 1,000 hours.)

MS Visio. I don't think it's possible to make a control chart in Visio because it's not really designed for this; therefore I'd steer clear of this software for this type of quality tool.

Minitab. This is the granddaddy for making control charts. Within a couple of mouse clicks, you can create one from the data you enter. Pretend to your colleagues that it's really difficult and takes hours to create, then do one in a minute, and you'll look like either a hero or a liar. All the charts in this column were produced with Minitab software. I love it when someone else can make my work easy, in this case with all the rules and calculations built in, ready for me.

Other statistical software: Basically, if it is a package that is useful for statistical analysis, it will build a control chart simply and quickly for you. There are many out there that I am aware of and have used; however, as I said earlier, this series is for newly initiated quality professionals, and in my experience the ones I've mentioned are quite common in industry. Should you have a different statistical package not listed here, I'm sure it will be just as useful to you.

Hints and tips

Please try not to confuse control limits with engineering or other tolerances (i.e., specification limits). A process you measure can be in statistical control and yet not be capable of staying within tolerance.

Check and recheck your calculations if you are using an Excel template. Trust me; I've used someone else's template and wondered why I got a wonky average line after running reports.

It will take some practice to quickly read and interpret charts. If you have a neighborhood statistical guru or Black Belt, pay him a visit.

Don't make control charts for the sake of creating a chart. They should have a purpose, like any other analytical tool, and should be used primarily to help your customer.

Too many control charts can be as painful as no control charts at all. If we flood the place with control charts, measuring and analyzing all these data can lead to some level of paralysis.

Remember the golden rule of probability: Nothing is always certain. If used correctly, control charts are one way to express what could happen next, based only on past history and the span of your control limits. However, I wouldn't want you to fall into any bear trap from unintentionally misleading colleagues, so remember the golden rule.

So there you have it, the third of the seven quality tools as seen by this Improvement Ninja. In the next installment, I will tackle the histogram. Until then, if you would like to share your hints and tips, or even gripes, about the tool or its related software, please add a comment below.

And finally, a quality anecdote:

A small-budget film crew was shooting on a remote mountainous location. Suddenly a quality professional on vacation showed up, walked over to the director, and said, "Tomorrow there will be strong winds with a storm; I'd recommend not filming."

The director took his advice and shut down the set. Sure enough, a storm came, and the prepared director saved a lot of money.

A few days later, the film crew was again preparing to shoot, and the quality professional showed up. "Tomorrow there will be a hurricane," he warned. "No shooting, please."

A hurricane came up, and the alerted director saved even more money.

The quality professional's amazing accuracy when predicting snow, rain, ice, blizzards, lightning, and thunderstorms was saving the director a bundle, and he became fond of this weather genius. They would hang out by the campfire after hours and got to know each other fairly well. The quality professional explained about the seven quality tools, and the director really liked the idea of control charts, especially the predictive benefits they offer. The director was amazed that such a humble man as the quality professional had the skill, knowledge, and wisdom to make such accurate predictions.

Now the director was preparing to shoot an important scene and was waiting for the quality professional to come and predict the weather, but he was nowhere to be found. The director personally went looking for him and found him in a little tent in the forest. He went inside and desperately requested the weather prediction for tomorrow's finale.

"Tomorrow's forecast?" said the quality professional. "Sorry, I can't provide a prediction."

"What do you mean?" said the director, flabbergasted. "You taught me about control charts and how to recognize if a process is stable! You said you can identify if something special is happening in the process! So what can you say about the weather tomorrow?"

"You are right that I use control charts to understand quality in a process," said the quality professional. "I help the widget company make widgets, and I make the best control charts in the entire industry. But I can't tell you tomorrow's forecast if my radio's batteries are dead."


About The Author


Paul Naysmith

Paul Naysmith is the author of Business Management Tips From an Improvement Ninja and Business Management Tips From a Quality Punk. He’s also a Fellow and Chartered Quality Professional with the UK’s Chartered Quality Institute (CQI), and an honorary member of the South African Quality Institute (SAQI). Connect with him at www.paulnaysmith.com, or follow him on Twitter @PNaysmith.

Those who have read Paul’s columns might be wondering why they haven’t heard from him in a while. After his stint working in the United States, he moved back to his homeland of Scotland, where he quickly found a new career in the medical-device industry; became a dad to his first child, Florence; and decided to restore a classic car back to its roadworthy glory. With the help of his current employer, he’s also started the first-of-its-kind quality apprenticeship scheme, which he hopes will become a pipeline for future improvement ninjas and quality punks.



Paul, I will not be as polite and diplomatic as my good friend Davis. Regarding your calculation of control limits, you are COMPLETELY WRONG!!! You mention Shewhart's work, but obviously did not learn much from it. Have you read the book?

Your data is time-ordered. The SD calculation has nothing to do with time order. The SD will be the same value regardless of the order of the data. That alone should be a clue that SD is not the proper statistic to calculate control limits. An "average dispersion statistic" (i.e., average moving range) must be used with the proper bias factor. SD is an enumerative calculation, while a control chart is an analytic tool. There is a huge difference between "enumerative" and "analytical" studies. This concept was well-documented by Deming. Whatever Improvement Ninja level you possess, please demote yourself by two belts, Grasshopper.

Your data obviously shows a shift - another clue. So, calculating control limits the proper way from all the data does not complete the analysis. In fact, the Mean and Control Limits for such an out-of-control situation are meaningless! The system is obviously not showing a "reasonable degree of statistical control" and should be split into at least two separate periods and the limits recalculated (properly) for each.

I'll stop now, before I hyperventilate. Sorry if I seem harsh, but you need to properly educate yourself before writing articles for public view.

Calculation of Standard Deviation

Be prepared for questions when people ask you, "Why don't I get the same limits as Minitab?" -- because one should never use the "typical" calculation of standard deviation for control limits, which is what you describe (As I say to people I teach regarding that, "What part of 'never' don't you understand?").  If there are special causes, as there seem to be in your data, this estimate will be inflated, sometimes seriously.

Given the way you describe the data collection, there are also a couple of different ways you could calculate it -- Wheeler has discussed this in his many "rational subgrouping" columns.

And your alleged special cause -- Point 7 is indeed below the limit.  Maybe it isn't even the special cause.  Rather than that one point below the limit, one should investigate a possible process shift that seems to settle in around observations 16-17.  This SHIFT might be the special cause and Point 7 could actually be common cause before the shift took place!  That said...

...I'd also be concerned about points 5 and 6, which could be a special cause in "process 1" -- as could points 21 and 22 in "process 2."  With the data, one could see whether the moving ranges from point 4 to 5 and point 6 to 7 might be suspect...as might the moving ranges from  20 to 21 and 22 to 23.  You get the idea -- it's not just as simple as calculating limits and looking for points outside the limits.

And if you sub-grouped the data (a possibility given the way you collected it), then the moving range isn't appropriate.  BUT...calculating the standard deviation correctly, THAT is what might flag points 5 and 6 and 21 and 22 as special causes (outside the limits) with their appropriate center lines if the shift is confirmed. 

I appreciate what you're trying to do in these columns on the tools, but I just couldn't let these (relatively) common teaching errors about calculating the standard deviation or interpretation get a free pass.

Just goes to show you, a statistician's favorite answer is, "It depends."  If only it were so simple.  Rather than teach people tools, we should try to teach them how to ask better questions to change conversations.

Davis Balestracci

Great comment Davis, I love

Great comment Davis, I love your last line "Rather than teach people tools, we should try to teach them how to ask better questions to change conversations." so very true. Thank you for taking the time to read my little article.

Simple-yes and no


Well....not as simple as you think.

The "proper" calculation for control limits uses an average (local) dispersion statistic, not the global formula you suggest. This use of a local statistic minimizes the effect of out-of-control points on the limits. As Wheeler says: You get Good Limits from Bad Data.

Read Wheeler....and then read him again.

Rich DeRoeck 

Thanks for the comment Rich,

Thanks for the comment Rich, I agree that Wheeler is a great resource to tap into and a great recommendation for beginners into the field of Quality.

Hey Paul great job. I read

Hey Paul great job. I read your first and second part also of Ninja, and I liked this article of control chart also. You have a great caliber of putting your ideas precisely and effectively. Waiting for your other articles.