Quality Digest
June 28, 2017



Departments: SPC for the Real World

Davis Balestracci

TQM, Six Sigma, Lean and… Data?

Too often we lose sight of these methods' original intent and drown in minutiae.


I believe W. Edwards Deming's philosophy of quality has proven the most robust. As a friend of mine characterized it almost 20 years ago, "Deming is true religion, Juran is a very good church historian, and Crosby is a TV evangelist."

After recently hearing a world expert on lean, I realized I'd been unintentionally teaching that methodology for the last 18 years. What people don't realize is that all of these movements evolved out of the same sound, robust theory. When this theory hardens into a straitjacketed fad du jour, people lose sight of the original intent and begin arguing over, and drowning in, minutiae.

Brian L. Joiner's book Fourth Generation Management (McGraw-Hill, 1994) may be the best overall quality book available. Everyone going through the legalized torture of Six Sigma belt training should read it to understand what the role of statistics is really all about. Jim Clemmer's Firing on All Cylinders (TCG Press, 1992), which integrates theory and the "cultural" aspects of improvement into a robust structure, runs a close second.

Every alleged monumental savings brought about by Six Sigma makes me think of a quote attributed to Deming: "All you've done is get your processes to where the hell they should have been in the first place. That is not improvement." Getting rid of underlying waste is indeed a nontrivial accomplishment, but once that's done, there's nothing left: That rate of savings won't sustain itself.

The first edition of The Team Handbook (Joiner Publishing, 1988) contains what I consider the best high-level summary of improvement. It's all about process. There are six sources of problems with a process:

1) Inadequate knowledge of how a process works

2) Inadequate knowledge of how a process should work

3) Errors and mistakes in executing procedures

4) Current practices that fail to recognize the need for preventive measures

5) Unnecessary steps, inventory buffers, and wasteful measures and/or data

6) Variation in inputs and outputs


Key elements of lean include process mapping and error proofing (sources 1 and 4) as well as an obsession with waste (source 5). Statistics applies to sources 2, 3 and 6 to expose and reduce inappropriate and unintended variation. This involves using statistical techniques to test theories, assess interventions and hold gains. Improving quality means improving processes to make them as consistently predictable as possible.

Consider the use of statistics as a data process, or rather four processes: measurement, collection, analysis and interpretation. Each of these has the six classic sources of process inputs (people, methods, machines, materials, measurements and environments), and each can potentially be improved by the framework above.

Measurement. Are the data operationally defined with clear objectives? Deming was fond of saying, "There is no true value of anything!"

Not only that, do the people collecting the data know why they're collecting the data and how to collect them?

Collection. Don't make the assumption that data are being collected in a way that makes your proposed analysis appropriate. Commonly, data collected originally for another purpose are tortured until they confess to a project's hidden agenda. Many times this is an amalgam of continuously recorded administrative procedures (CRAP).

Analysis. Are you aware that your analysis should be known before one piece of data has been collected?

Interpretation. Did you know that statistics is a set of techniques used not to "massage" data but to appropriately interpret the variation you face in a given situation? That variation can be one of two types, and treating one type as the other actually makes the situation worse.
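The two types alluded to here are, in SPC terms, common-cause and special-cause variation, and a control chart is the standard tool for telling them apart. Below is a minimal sketch of an individuals (XmR) chart calculation; the function name and the monthly defect counts are invented for illustration, not taken from the article:

```python
def xmr_limits(data):
    """Compute individuals-chart limits from moving ranges.

    Uses the standard XmR scaling constant 2.66 (3 / d2, where
    d2 = 1.128 for moving ranges of size 2).
    """
    n = len(data)
    mean = sum(data) / n
    moving_ranges = [abs(data[i] - data[i - 1]) for i in range(1, n)]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

# Invented monthly defect counts. Points inside the limits reflect
# common-cause variation (react by improving the process itself);
# points outside signal special causes worth a targeted investigation.
counts = [12, 13, 11, 12, 14, 12, 13, 27, 12, 13]
lower, center, upper = xmr_limits(counts)
specials = [x for x in counts if x < lower or x > upper]
```

Here only the spike of 27 falls outside the computed limits; reacting to every month-to-month wiggle inside the limits would be exactly the "treating one type as the other" mistake.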


Think of many meetings (in essence, analytical and interpretive processes) where tabular data, bar graphs and trend lines are compared to arbitrary goals, resulting in interventions. Could "human variation" in perception also manifest through the six input sources to affect the quality of each of the four data processes and the subsequent decisions? How many meetings are reactions to variation in the data processes, and not necessarily in the process being improved?

I leave you with the eight crucial questions of the data process:

Why collect the data?

What methods will be used for the analysis?

What data will be collected?

How will the data be measured?

How often will the data be collected?

Where will the data be collected?

Who will collect the data?

What training is needed for the data collectors?


Statistics isn't the science of analyzing data but rather the art and science of collecting and analyzing data simply and efficiently.

About the author
Davis Balestracci is a member of the American Society for Quality and past chair of its statistics division. Visit his Web site at www.dbharmony.com.