Jay Arthur—The KnowWare Man

Quality Insider

Dumb It Down or Simple It Up?

Time to lose the Six Sigma jargon

Published: Tuesday, May 29, 2012 - 11:22

At the ASQ World Conference held in Anaheim last week, I ran into my old friend Jack Revelle, author of many SPC books and videos. He said clients were constantly asking him to take Six Sigma and “dumb it down.” Surprisingly, despite everything the Six Sigma community knows about the voice of the customer, this simple request goes largely unheard.

Instead, we insist that these Six Sigma newbies learn our language. Consider terms like “nonparametric” and “null hypothesis.” One might as well be speaking Swahili or Klingon. Like an American in Paris, we don’t even consider learning how to speak their language.

Many years ago, Tom DeMarco, author of Structured Analysis and System Specification (Prentice Hall, 1979), said something that has stuck with me: “Making complex topics simple is a huge intellectual feat.” I’d like you to consider that people are saying “dumb it down” when what they really mean is “simple it up.”

Having worked at a phone company for years, I figured out that most large corporations are largely devoid of statisticians, and any attempt to convert the rank and file into statisticians would take too long and cost too much to deliver any real value. So I decided to seek ways to “simple up Six Sigma” so that everyone could “get it” without all of the pain, suffering, and mental withdrawal. Through my YouTube Money Belt training, my book Lean Six Sigma Demystified (McGraw-Hill, 2010), and my software, I keep working on unlocking the mysteries of lean Six Sigma for the layperson.

The only way I can see to achieve the level of lean Six Sigma adoption and worldwide quality that we all hope for is to “dumb it down” and “simple it up” to the point that anyone can get it. On YouTube, the Khan Academy, funded by the Gates Foundation, is attempting to create a “world-class education” that anyone can take for free. Watch some of Khan’s statistics videos. He knows how to “simple it up.”

So, stop hanging on to tired old Six Sigma jargon. Stop speaking statistician. Start learning how to speak layperson. Start thinking about how to dumb it down and simple it up. For lean Six Sigma to thrive, it must first take root in a culture. The soil has to be prepared and tended with care.

The complexity of language, methods, and software is a barrier to success. One attendee lamented that his company owns 500 copies of the most widespread (and most complex) SPC software, but no one uses it. Complexity prevents adoption and usage.

Isn’t it time we started listening to the voice of our customers, who continue to demand that we “dumb it down” and “simple it up” to meet their needs, not ours?


About The Author


Jay Arthur—The KnowWare Man

Jay Arthur, speaker, trainer, founder of KnowWare International Inc., and developer of QI Macros SPC Software for Excel, understands how to pinpoint areas for improvement in processes, people, and technology. He uses data to identify broken processes and helps teams understand their communication styles and restore broken connections. Arthur is the author of Lean Six Sigma for Hospitals (McGraw-Hill, 2011) and Lean Six Sigma Demystified (McGraw-Hill, 2010). He has 30 years’ experience developing software. Located in Denver, KnowWare International helps service and manufacturing businesses use lean Six Sigma tools to drive dramatic performance improvements.


Dumb Enough

Believers in the Six Sigma nonsense are already dumb enough. Simple-minded followers of any specification-based methodology should come out of the Dark Ages and read Shewhart, Deming, and Wheeler!


I swear if I see one more p-value I will scream... I do get tired of having to explain that a mathematically precise and statistically significant 'difference' is precisely incorrect and/or irrelevant. I have taught quality improvement and product design using statistically sound methods for years now (under the guise of 'six sigma'). I have recently had to bring out the phrases 'enumerative' and 'analytic' to counter a resurgence of "statistical precision must be used in every case" in my organization.

There are many excellent instructors and practitioners of analytic studies out there; it's just really hard to find them in the forest of statistical clerics. We need to hear more from them!

Thank you and Amen!

As you say, there's no magic to Lean and 6Sig: both depend on some sort of process analysis and management, as do ISO 9001 and CMMI. The things that both amuse and dismay me about 6Sig are the cultishness of the jargon and, yes, the statistical overkill (at least as some apply it). Both inhibit adoption by the "lay" community.

As you said, "Complexity prevents adoption and usage." Or as one of my old human factors pals used to say, "A useless product or service won't be used. A useful product or service won't be used if it's not usable." Yep, that's how the world works. If people can't understand the concepts, forget buy-in; the method is destined to be avoided and forgotten. If the lingo is a dis-enhancer, fix the lingo; then you'll at least have a shot. And focus on just the statistical items that make sense for the scope at hand. You don't get extra credit for complexity.

If you want something to be used, it's gotta be usable (understandable).  

OK - time to quit foaming at the mouth ...


Simple it up; I'll use that!

I couldn't agree more, with the article or with Cliff. I've been saying it for years... we've been putting the enumerative cart before the analytic horse for too long. It's funny, though... I've been speaking at conferences on that for a while now (especially on the "no control charts until C" problem), and people who come to my presentations always give feedback that essentially says, "Of course we should do that... why are you bothering to tell us this?" Yet the ASQ Body of Knowledge for Six Sigma doesn't have control charts until the control phase (although we are somehow expected to do capability studies in D or M).

Rogers, in "Diffusion of Innovations," describes factors that affect the speed of adoption of a new idea or innovation: relative advantage, complexity, compatibility, trialability, and observability. It's not too hard to make a case that data-based methods for quality improvement (like Six Sigma) have pretty strong relative advantage, trialability, and observability. It's much more difficult to overcome the complexity and compatibility hurdles, especially in the U.S., where aversion to math has become a cultural norm.

We could be on the wrong channel...

Philip Crosby once told me that if you go to the local college and ask a professor to teach you about "quality," very soon you will be learning the ideas that got you into trouble in the first place. Much of the complexity that has shown up in Six Sigma is the emphasis on enumerative statistics, much of which is not useful for analytic problems. In the book Quality Improvement through Planned Experimentation by Moen, Nolan, and Provost, Deming wrote the following in the foreword:


This book by Ronald D. Moen, Thomas W. Nolan, and Lloyd Provost breaks new ground in the problem of prediction based on data from comparisons of two or more methods or treatments, tests of materials, and experiments. Why does anyone make a comparison of two methods, two treatments, two processes, or two materials? Why does anyone carry out a test or an experiment? The answer is to predict—to predict whether one of the methods or materials tested will in the future, under a specified range of conditions, perform better than the other one.

Prediction is the problem, whether we are talking about applied science, research and development, engineering, or management in industry, education, or government. The question is, What do the data tell us? How do they help us to predict? Unfortunately, the statistical methods in textbooks and in the classroom do not tell the student that the problem in the use of data is prediction. What the student learns is how to calculate a variety of tests (t-test, F-test, chi-square, goodness of fit, etc.) in order to announce that the difference between the two methods or treatments is either significant or not significant. Unfortunately, such calculations are a mere formality. Significance or the lack of it provides no degree of belief—high, moderate, or low—about prediction of performance in the future, which is the only reason to carry out the comparison, test, or experiment in the first place.

Any symmetric function of a set of numbers almost always throws away a large portion of the information in the data. Thus, interchange of any two numbers in the calculation of the mean of a set of numbers, their variance, or their fourth moment does not change the mean, variance, or fourth moment. A statistical test is a symmetric function of the data. In contrast, interchange of two points in a plot of points may make a big difference in the message that the data are trying to convey for prediction.

The plot of points conserves the information derived from the comparison or experiment. It is for this reason that the methods taught in this book are a major contribution to statistical methods as an aid to engineers, as well as to those in industry, education, or government who are trying to understand the meaning of figures derived from comparisons or experiments. The authors are to be commended for their contributions to statistical methods.

W. Edwards Deming
Washington, July 14, 1990
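Deming's point about symmetric functions is easy to demonstrate: swapping observations leaves every summary statistic unchanged, but it can erase the time-ordered signal that a plot of points reveals. A minimal sketch (the data values here are made up for illustration):

```python
# A statistical test is a symmetric function of the data: reordering the
# observations changes nothing it can see, but changes everything a
# time-ordered plot would show.
from statistics import mean, pstdev

a = [1, 2, 3, 4, 5, 6, 7, 8]   # a steady upward trend over time
b = [1, 8, 3, 6, 5, 4, 7, 2]   # the same numbers, reshuffled

assert mean(a) == mean(b)       # identical mean
assert pstdev(a) == pstdev(b)   # identical standard deviation

# Yet a run chart of `a` rises at every step, while `b` looks like noise.
rising_a = sum(y > x for x, y in zip(a, a[1:]))   # 7 upward steps
rising_b = sum(y > x for x, y in zip(b, b[1:]))   # 3 upward steps
print(rising_a, rising_b)
```

A t-test comparing either sequence to a target would report the same result for both, even though only one of them is trending toward trouble.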

We would do well to get back to a plot of points to understand the data before jumping to the conclusion that the process must be redesigned. Unfortunately, many do not entertain the control chart until "C" in DMAIC; this is too late, and too much time and money have been wasted following a yellow brick road of tools. As Shewhart noted, processes in nature are inherently stable; man-made processes are inherently unstable. The first job is to understand the variation. Are we dealing with an unstable system (probably) or a stable system producing only common cause variation? Answering this question in "C" is too late.
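That "first job" of understanding variation is commonly done with an individuals (XmR) control chart. A minimal sketch of the standard limit calculation, using made-up sample values (2.66 is the usual XmR scaling factor, 3/d2 with d2 = 1.128 for moving ranges of two):

```python
# XmR (individuals) chart: natural process limits from the average
# moving range, then flag points outside them as special-cause signals.
data = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3, 10.0, 14.2, 10.1, 9.7]

center = sum(data) / len(data)
moving_ranges = [abs(y - x) for x, y in zip(data, data[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

ucl = center + 2.66 * avg_mr   # upper natural process limit
lcl = center - 2.66 * avg_mr   # lower natural process limit

# A point beyond the limits suggests special-cause variation to
# investigate; points inside suggest routine, common-cause variation.
signals = [(i, x) for i, x in enumerate(data) if x > ucl or x < lcl]
print(f"limits: {lcl:.2f} to {ucl:.2f}, signals: {signals}")
```

Here the value 14.2 falls above the upper limit, so the chart answers the stability question on day one, long before the "C" phase.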

Clifford Norman