Eston Martz


Five More Critical Six Sigma Tools: A Quick Guide

Getting familiar with these tools is a good way to get started on your quality journey

Published: Wednesday, September 6, 2017 - 12:02

The Six Sigma quality improvement methodology has lasted for decades because it gets results. Companies in every country around the world, and in every industry, have used this logical, step-by-step method to improve the quality of their processes, products, and services. And they’ve saved billions of dollars along the way.

However, Six Sigma involves a good deal of statistics and data analysis, which makes many people uneasy. Those who are new to quality improvement often feel intimidated by its statistical aspects.

Don’t be intimidated. Data analysis may be a critical component of improving quality, but the good news is that most of the analyses we use in Six Sigma aren’t hard to understand, even if statistics isn’t something you’re comfortable with.

Just getting familiar with the tools used in Six Sigma is a good way to get started on your quality journey. In my last column, I offered a rundown of five tools that crop up in most Six Sigma projects. Here, I’ll review five more common statistical tools, and explain what they do and why they’re important in Six Sigma.

1. T-tests

We use t-tests to compare the average of a sample to a target value, or to the average of another sample. For example, a company that sells beverages in 16-oz. containers can use a 1-sample t-test to determine if the production line’s average fill is on or off target. If you buy flavored syrup from two suppliers and want to determine if there’s a difference in the average volume of their respective shipments, you can use a 2-sample t-test to compare the two suppliers. 
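The 1-sample case can be sketched with Python's standard library alone. This is a minimal illustration, not a full analysis: the fill measurements are hypothetical, and the critical value 2.262 is the two-sided t-table value for 9 degrees of freedom at a 0.05 significance level.

```python
import math
import statistics

TARGET = 16.0  # nominal fill volume in ounces

# Hypothetical fill measurements from the production line
fills = [15.9, 16.1, 15.8, 16.0, 15.7, 16.2, 15.9, 15.8, 16.0, 15.6]

n = len(fills)
mean = statistics.mean(fills)
s = statistics.stdev(fills)  # sample standard deviation

# 1-sample t statistic: how many standard errors the sample
# mean sits from the target value
t = (mean - TARGET) / (s / math.sqrt(n))

# Two-sided critical value for alpha = 0.05 with n - 1 = 9
# degrees of freedom, taken from a t table
T_CRIT = 2.262

print(f"t = {t:.3f}")
print("off target" if abs(t) > T_CRIT else "no evidence the line is off target")
```

In practice a statistical package reports a p-value directly, but the t statistic itself is nothing more than the distance from the target measured in standard errors.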

2. ANOVA

Where t-tests compare a mean to a target, or two means to each other, ANOVA—which is short for analysis of variance—lets you compare more than two means. For example, ANOVA can show you if average production volumes across three shifts are equal. You can also use ANOVA to analyze means for more than one variable. For example, you can simultaneously compare the means for three shifts and the means for two manufacturing locations. 
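The three-shift comparison reduces to a single F statistic: the variation between shift averages divided by the variation within each shift. Here is a minimal sketch with hypothetical production volumes; the 4.26 cutoff is the F-table critical value for (2, 9) degrees of freedom at a 0.05 significance level.

```python
import statistics
from itertools import chain

# Hypothetical production volumes (units per hour) for three shifts
shifts = {
    "day":   [22, 24, 23, 25],
    "swing": [20, 21, 19, 22],
    "night": [24, 26, 25, 23],
}

groups = list(shifts.values())
k = len(groups)                        # number of groups
n_total = sum(len(g) for g in groups)  # total observations
grand_mean = statistics.mean(chain.from_iterable(groups))

# Between-group sum of squares: how far each shift's mean sits
# from the overall mean
ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2 for g in groups)

# Within-group sum of squares: scatter of observations around
# their own shift's mean
ss_within = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g)

f_stat = (ss_between / (k - 1)) / (ss_within / (n_total - k))
print(f"F = {f_stat:.2f}")  # compare to F(2, 9) critical value, 4.26 at alpha = 0.05
```

A large F means the shift averages differ by more than the shift-to-shift noise can explain, which is exactly the question ANOVA answers.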

3. Regression

Regression helps you determine whether there’s a relationship between an output and one or more input factors. For instance, you can use regression to examine if there is a relationship between a company’s marketing expenditures and its sales revenue. When a relationship between the variables exists, you can use the regression equation to describe that relationship and predict future output values for given input values.
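For a single input, the fitted line comes straight from the least-squares formulas. The sketch below uses made-up monthly figures for the marketing example; the point is the mechanics of fitting and predicting, not the numbers themselves.

```python
import statistics

# Hypothetical monthly data: marketing spend vs. sales revenue ($k)
spend   = [10, 20, 30, 40, 50]
revenue = [25, 42, 61, 78, 99]

mx, my = statistics.mean(spend), statistics.mean(revenue)

# Least-squares slope: covariance of x and y over variance of x
slope = (sum((x - mx) * (y - my) for x, y in zip(spend, revenue))
         / sum((x - mx) ** 2 for x in spend))
intercept = my - slope * mx

# The fitted equation predicts revenue at a new spend level
predicted = intercept + slope * 60
print(f"revenue = {intercept:.1f} + {slope:.2f} * spend")
print(f"predicted revenue at spend = 60: {predicted:.1f}")
```

With more than one input factor the same idea extends to multiple regression, which is where statistical software earns its keep.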

4. DOE (design of experiments)

Regression and ANOVA are most often used for data that have already been collected. In contrast, design of experiments (DOE) gives you an efficient strategy for collecting your data. It permits you to change or adjust multiple factors simultaneously to identify whether relationships exist between inputs and outputs. Once you collect the data and identify the important inputs, you can then use DOE to determine the optimal settings for each factor. 
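The smallest useful design is a full factorial: run every combination of factor levels, then compute each factor's main effect as the average response at its high level minus the average at its low level. The factors, levels, and yields below are hypothetical, chosen only to show the bookkeeping.

```python
from itertools import product
from statistics import mean

# A hypothetical 2 x 2 full factorial: every combination of two
# factors, each at two levels
temperatures = [150, 170]  # degrees C
times = [30, 60]           # minutes

design = list(product(temperatures, times))  # 4 experimental runs

# Hypothetical measured yields (%) for each run
yields = {(150, 30): 74, (150, 60): 78, (170, 30): 81, (170, 60): 90}

# Main effect of a factor: mean response at its high level minus
# mean response at its low level
temp_effect = (mean(yields[(170, t)] for t in times)
               - mean(yields[(150, t)] for t in times))
time_effect = (mean(yields[(T, 60)] for T in temperatures)
               - mean(yields[(T, 30)] for T in temperatures))

best = max(yields, key=yields.get)
print(f"temperature effect: {temp_effect}, time effect: {time_effect}")
print(f"best setting: {best}")
```

Real DOE software adds fractional designs, randomization, and interaction terms, but the core logic of varying factors together and comparing level averages is just this.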

5. Control charts

Every process has some natural, inherent variation, but a stable (and therefore predictable) process is a hallmark of quality products and services. It’s important to know when a process goes beyond the normal, natural variation because it can indicate a problem that needs to be resolved. A control chart distinguishes “special cause” variation from acceptable, natural variation. These charts graph data over time and flag out-of-control data points, so you can detect unusual variability and take action when necessary. Control charts also help you ensure that you sustain process improvements into the future. 
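For individual measurements, the control limits of an I-chart are set at the process mean plus or minus 2.66 times the average moving range (2.66 is the standard individuals-chart constant, 3 divided by d2 = 1.128). A minimal sketch with hypothetical data:

```python
import statistics

# Hypothetical consecutive measurements from a process
data = [5.1, 4.9, 5.0, 5.2, 4.8, 5.1, 5.0, 6.5, 5.0, 4.9]

center = statistics.mean(data)

# The average moving range between consecutive points estimates
# the natural (common-cause) variation
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
mr_bar = statistics.mean(moving_ranges)

# Individuals-chart control limits: 2.66 = 3 / d2, with d2 = 1.128
ucl = center + 2.66 * mr_bar
lcl = center - 2.66 * mr_bar

# Points outside the limits signal special-cause variation
out_of_control = [i for i, x in enumerate(data) if x > ucl or x < lcl]
print(f"center = {center:.2f}, limits = ({lcl:.2f}, {ucl:.2f})")
print(f"out-of-control points at indexes: {out_of_control}")
```

A charting package draws this over time, but the flagged index is the same signal: that point's deviation is too large to blame on natural variation alone.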


Any organization can benefit from Six Sigma projects, and those benefits are based on data analysis.  However, many Six Sigma projects are completed by practitioners who are highly skilled, but not expert statisticians. A basic understanding of common Six Sigma statistics, combined with easy-to-use statistical software, will let you handle these statistical tasks and analyze your data with confidence. 


About The Author

Eston Martz

For Eston Martz, analyzing data is an extremely powerful tool that helps us understand the world—which is why statistics is central to quality improvement methods such as lean and Six Sigma. While working as a writer, Martz began to appreciate the beauty in a robust, thorough analysis and wanted to learn more. To the astonishment of his friends, he started a master’s degree in applied statistics. Since joining Minitab, Martz has learned that a lot of people feel the same way about statistics as he used to. That’s why he writes for Minitab’s blog: “I’ve overcome the fear of statistics and acquired a real passion for it,” says Martz. “And if I can learn to understand and apply statistics, so can you.”