
Scott A. Hindle


Process Capability: What It Is and How It Helps, Part 1

Is it all about computing statistics?

Published: Monday, August 29, 2016 - 09:49

In my August 2015 article, “Process Capability: How Many Data?” I discussed whether 30 data were the “right” number in an analysis of process capability. In this four-part series, the focus is on understanding what process capability is and the pitfalls associated with it, along with how it can help manufacturers develop process knowledge, reach better decisions, and take better actions.

Product 874: What is process capability?

The story starts with Alan, a relative novice in the field of process capability, who was assigned the task of writing a report on the process capability for a key product characteristic of Product 874, a powder product. The 56 data values he received are found in figure 1. Alan’s brief was to use these data to write a report covering:
• The process capability results for the characteristic under study
• An interpretation of the results
• An appendix of all calculations in Excel for traceability purposes

Figure 1: Sample data for a product characteristic of Product 874.

Unaware of some of the pitfalls that must be avoided, Alan jumped straight into the computations. He used the formulas for the two capability indexes Cp and Cpk he remembered from a training class some months back, which he’d written down as:

Cp = (USL – LSL) / (6 × SDprocess)

Cpk = min{USL – Average, Average – LSL} / (3 × SDprocess)

where:

• USL and LSL stand for upper and lower specification limit
• SDprocess stands for “process standard deviation”
• min{} means minimum of what is in the brackets

Working in Microsoft Excel with the 56 data values, Alan obtained:
• Average: 9.799 (Excel function “AVERAGE” applied to all 56 values)
• Standard deviation: 0.421 (Excel function “STDEV” applied to all 56 values)

Given the specifications of 8.30 to 11.30, Alan obtained:

Cp = (11.30 – 8.30) / (6 × 0.421)

Cpk = min{11.30 – 9.799, 9.799 – 8.30} / (3 × 0.421)
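Alan’s spreadsheet arithmetic is easy to mirror outside of Excel. The short Python sketch below is illustrative only: it reuses the summary statistics quoted above rather than the 56 raw values, and it follows Alan’s choice of the global standard deviation.

```python
# Capability indexes from summary statistics, mirroring Alan's
# Excel calculation (the 56 raw values are not reproduced here).
usl, lsl = 11.30, 8.30      # upper and lower specification limits
average = 9.799             # Excel AVERAGE of all 56 values
sd_process = 0.421          # Excel STDEV of all 56 values (global SD)

cp = (usl - lsl) / (6 * sd_process)
cpk = min(usl - average, average - lsl) / (3 * sd_process)

print(round(cp, 2), round(cpk, 2))   # both round to 1.19
```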

When he rounded these figures to two decimals, the Cp and Cpk values were both 1.19. Alan had been told that a minimum capability of 1.33 was the standard expectation. He wasn’t sure exactly where this 1.33 came from, but he saw that his capability statistics of 1.19 fell a little short of this. Unsure how to proceed, Alan arranged a meeting with Sarah, a colleague. To be well prepared, he reviewed some of his statistics material to see how he might proceed.

Alan continued with a statistical test for normality, having learned that Cp and Cpk statistics depend on normality. Using the Anderson-Darling test, he calculated a p-value of 0.064. With a p-value greater than 0.05 in hand, Alan was satisfied that his data were normally distributed. He decided the next step was to convert his capability statistics into PPM values (parts per million values) to present the information in more management-friendly language.
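In Python, a check of this kind can be sketched with scipy.stats.anderson, which reports the Anderson-Darling statistic against critical values rather than a p-value. The data below are randomly generated stand-ins, since the values in figure 1 are not reproduced here.

```python
import numpy as np
from scipy import stats

# Stand-in data: 56 values drawn from a normal distribution purely
# for illustration; Alan would use the actual Product 874 data.
rng = np.random.default_rng(0)
data = rng.normal(loc=9.8, scale=0.42, size=56)

result = stats.anderson(data, dist='norm')
# Critical value at the 5-percent significance level.
crit_5pct = dict(zip(result.significance_level, result.critical_values))[5.0]

# Normality is not rejected at the 5-percent level if the
# statistic is below the critical value.
print(result.statistic, crit_5pct)
```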

At this point, Alan realized the importance of normality: A normal probability model made possible the computation of PPM estimates. Using z-scores (e.g., a Cp of 1.189 has a z-score of 3.566), Alan calculated:
• For Cp, a PPM value of 362.249, or 0.036 percent (assuming the process is perfectly centered)
• For Cpk, a PPM value of 362.253, or 0.036 percent (based on the current process average)
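The conversion from capability index to PPM rests entirely on the normal model. A sketch of the centered-process case, using the z-score of 3.566 quoted above:

```python
from scipy.stats import norm

z = 3.566   # z-score for a Cp of about 1.19, as quoted in the text
# Out-of-specification fraction for a perfectly centered normal
# process: both tails beyond +/- z, scaled to parts per million.
ppm_centered = 2 * norm.sf(z) * 1e6
print(round(ppm_centered, 1))   # roughly 362 PPM, about 0.036 percent
```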

Interpreting these values to mean that 100 – 0.036 = 99.964 percent of process output is within specification, Alan thought there might be a case for this process being considered capable. He was curious to hear Sarah’s opinion.

Having made quicker progress than expected, Alan computed a few more statistics in case they helped to better understand the problem at hand. He summarized his statistics for the discussion with Sarah as shown below in figure 2.

Figure 2: Summary of Alan’s statistics for Product 874 data

Finally, Alan made a note to mention that the process was on target (target: 9.80). Using a one-sample t-test, the high p-value of 0.992 reassured him that this conclusion was spot on.
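The pattern of that check can be sketched as follows, again with randomly generated stand-in data; Alan’s p-value of 0.992 comes from the real Product 874 values.

```python
import numpy as np
from scipy import stats

# Stand-in data for illustration; Alan tested the actual 56 values.
rng = np.random.default_rng(1)
data = rng.normal(loc=9.8, scale=0.42, size=56)

# One-sample t-test of the mean against the target of 9.80.
t_stat, p_value = stats.ttest_1samp(data, popmean=9.80)

# A high p-value gives no evidence that the process mean
# differs from the target.
print(t_stat, p_value)
```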

The discussion with Sarah

Alan got the discussion going by referring to figure 2 and a printout on which the following questions were written:
• Why is the minimum capability standard set at 1.33?
• Is 99.964 percent conforming satisfactory?
• What do the confidence intervals for Cp and Cpk actually represent, and how should these confidence intervals be interpreted relative to the standard of 1.33?
• Would Sarah mind checking the Excel file for possible mistakes in the calculations?
• Are there any statistics missing that need to be included, such as PPM values for the lower and upper Cp and Cpk confidence intervals?
• If capability isn’t satisfactory, should Alan volunteer to lead an improvement project to achieve the minimum capability standard of 1.33?

Sarah started by explaining that, for a perfectly centered process with two specifications, a Cp of 1.33 is the same as ± four standard deviations from the midpoint of the specifications. This gives some “safety space,” equivalent to one standard deviation, to reduce the risk of out-of-specification process output when a process is subject to assignable causes, as most are from time to time. She also explained that, theoretically, a capability of 1.33 corresponds to 99.993 percent conforming output. She cautioned Alan that these theoretical figures, based on specific probability models, are often over-interpreted and misused.
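Sarah’s figures are easy to check against the normal model. Specification limits at ± four standard deviations correspond to a Cp of 4/3, about 1.33, for a centered process; the model then gives about 99.9937 percent conforming, which truncates to the 99.993 percent she quotes.

```python
from scipy.stats import norm

# Fraction conforming for a centered normal process with specs at
# +/- 4 standard deviations (i.e., Cp = 4/3, about 1.33).
conforming_pct = (1 - 2 * norm.sf(4.0)) * 100
print(round(conforming_pct, 3))   # prints 99.994
```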

Alan was about to bring up the next question when Sarah asked how he’d looked at process stability in his assessment of capability. Before he could respond, Sarah’s phone rang. She chatted for a moment before informing Alan that the meeting would have to be cut short. They agreed to continue the next day. In the meantime, Sarah gave Alan two papers before heading off, commenting that they might be of some use. The first, titled “Statistics and Medicine” by David and Sarah Kerridge (ASQ Statistics Division Newsletter, 2000, pp. 18–19), had some text highlighted.

“There is a false idea that statistics is part of mathematics. Mathematics is exact, and so therefore is statistics. But let us look at the kind of problem which I was taught to solve, many years ago, when I was learning mathematics:

“‘It takes two men 10 days to dig a hole. How long will it take five men?’

“In those days women didn’t dig holes. But that is the only thing about this piece of mathematics that was correct. We were supposed to answer, ‘Four days,’ reasoning that digging the hole is 20 man-days of work. But is that true?

“Perhaps five men will be faster because they can work as a team. Or they may be slower because they get in each other’s way. It depends on the shape of the hole. And why should we suppose that the different men would all work at the same rate?

“But that is mathematics for you. It deals very exactly with one part of the problem and ignores everything else. It ignores the connection between the calculation and the real world.

“Statistics is concerned above all with the real world. It deals with the difficulties of getting and interpreting evidence. In other words, it is part of the scientific method, rather than mathematics. Mathematics is quick. Science is slow but sure.

“The basic problem is illustrated by the hole-digging example. It is easy to get the calculations right. It is difficult to see how they apply to the real world. This relies on knowledge, experience, and, ultimately, good judgment.”

A little further down the page Alan read:
“Statisticians, on the other hand, specialize in problems where interpretation is difficult because the information we can get is inexact and possibly misleading.”

Finally, there was a handwritten note on the second page:
“Statistics summarize everything down into one number. If the data have two or more stories to tell, how can one summary statistic do the job?”

Alan saw no relevance for himself in the sentence on statisticians because he wasn’t one. He also didn’t really get how a production process could have different stories to tell. He was confident his calculations were right; hence, what could possibly be misleading? Wasn’t the decision now simply whether a Cp and Cpk of 1.19 was satisfactory? If not, what should Alan recommend?

The next morning it dawned on Alan that he was starting to think differently about the capability data he’d analyzed. So far his output was a multitude of calculated statistics. But what was the connection between the calculated values and the actual process? What “knowledge, experience, and judgment” did he have to make this connection? He challenged himself, asking what he actually knew about the data he was using. He accepted that a seed of doubt had planted itself in his head.

Note: If you didn’t pick up on at least one mistake that Alan made, we’ll get to that in subsequent parts of this article. A lack of understanding of process capability can set you off on a wild goose chase. More about that in part two.


About The Author


Scott A. Hindle

Scott Hindle supports R&D and factory operations on process capability studies for new products and processes, statistical process control (SPC) for use in routine production, and the use of online measurement devices as a part of both SPC and engineering process control.

