An Interview with Donald J. Wheeler
The statistician talks about Deming, quality issues, and his own interesting career
Dirk Dusharme @ Quality Digest
Published: Wednesday, May 4, 2011 - 05:30

Donald J. Wheeler has been awarded the Deming Medal by the American Society for Quality (ASQ) for “the propagation of Dr. Deming’s ideas throughout the world through his numerous books and seminars on quality management and statistical quality improvement.” Here he answers some questions for Quality Digest Daily.

Quality Digest Daily: Please give us a synopsis of your career.

Donald J. Wheeler: I received a bachelor of arts in mathematics from the University of Texas, and a master of science and a Ph.D. in statistics from Southern Methodist University. I taught at the University of Tennessee, Knoxville, from 1970 to 1982, where I was an associate professor of statistics. Since 1982, I have worked as an instructor and consulting statistician, conducting more than 1,000 seminars with more than 250 organizations in 17 countries. In 1986, my wife and I founded SPC Press and began publishing books on quality and productivity. I am the author or co-author of 25 books and have written monthly columns for Quality Digest and other quality publications. I have been elected a fellow of both the American Statistical Association and the American Society for Quality.

QDD: How did Dr. Deming influence you?

Wheeler: I met Dr. Deming at the Joint Statistical Meetings in Montreal in 1972. Since I had been teaching SPC at the University of Tennessee, and had also read [Dr. Walter A.] Shewhart’s Economic Control of Quality of Manufactured Product [D. Van Nostrand Co. Inc., 1931], I was able to begin to understand what Dr. Deming was talking about. I got further exposure when Dr. Deming came to Tennessee as a guest lecturer in 1974. While those lectures furthered my understanding of using data as the basis for taking action, it wasn’t until I was able to “assist” Deming in a seminar in Cincinnati in the fall of 1981 that I came to understand the profound difference between data analysis and mathematical statistics.

In the summer of 1982, I resigned from the university so that I could teach in industry full time. During the next 12 years, I assisted Deming in many additional seminars, learning more each time. He invited me to present papers at his NYU seminars for statisticians, and we worked together both here and in the United Kingdom [UK]. I started writing books in the mid-1980s, and my wife, Fran, became both my editor and my publisher. Through my work with Deming in the UK, I came to know Dr. Henry Neave. Out of this association we came to publish Neave’s outstanding summary of Deming’s message, The Deming Dimension, in 1990. Following the publication of this book, Deming called our office and asked Fran to edit the biography that his secretary, Cecelia Killian, had prepared. Fran did, and that is how we came to publish Deming’s biography in 1992 [The World of W. Edwards Deming]. So, over a period of 21 years, I had the privilege of being Deming’s student, assistant, and colleague.

QDD: Do you think that Deming’s ideas are still as valid today as they were 20, 30, or 40 years ago?

Wheeler: All profound ideas are timeless. While the details may change, the underlying principles remain the same. Deming’s fundamental ideas came, in part, from his association with Shewhart and his concept of an operational definition. I like to summarize the three parts of an operational definition with three questions:
“What do you want to accomplish?” “By what method will you accomplish it?” and “How will you know when you have accomplished it?” For years I observed managers telling Deming all the good things that they were going to do and heard him respond with one of two questions: “By what method?” or “How will you know?” It is just that basic. Until you can answer these two questions, all you have is wishful thinking. So, whether we call it the Shewhart cycle, the Deming cycle, [plan-do-check-act] PDCA, [plan-do-study-act] PDSA, or [define, measure, analyze, improve, control] DMAIC, Deming’s ideas are found at the heart of every effective improvement effort. Such ideas, like all profound truths, are eternally valid.

This also applies to the analysis of data. Since process behavior charts serve as an operational definition of how to get the most out of any process, they remain the foundation for successful process improvement. They work by highlighting those places and times when your process is changing, so that you can focus on these periods of change and discover the forces that affect your process. This idea about how to turn change into an opportunity to learn can be traced all the way back to Aristotle. It has been valid for more than 2,300 years. It is still valid today, and it will remain valid in the future. Those who focus on the essentials of data analysis will know this. Those who are always chasing the latest business fad and who want software that will automate all the different statistical tools are in danger of being overwhelmed with details and forgetting this fundamental truth.

QDD: Do you find that quality professionals, particularly the younger ones, are familiar with Deming and his ideas?

Wheeler: No. On the whole they do not understand what Deming actually taught. But this has been true in the past as well. In 1980 there were only a handful of people who really understood the message of Shewhart and Deming. Most of these individuals belonged to one of three groups: They were either on the faculty at the University of Tennessee, had been through the program at Tennessee, or had been Deming’s students at NYU. Following the NBC white paper, “If Japan Can, Why Can’t We?” the demand for Deming and for SPC training quickly outstripped this supply of competent instructors, with predictable results. Deming always said that students could not evaluate the quality of their instruction until their careers were over. For this reason, he said that students deserved to learn from a master, not a novice. However, beginning in the 1980s, and continuing down to the present, we have had novices doing much of the training. As a result, the teaching of SPC and industrial data analysis went out of control, with all kinds of erroneous ideas, fallacious techniques, and complete misunderstandings being carefully handed down from one group to the next.

QDD: You have been a statistician for 41 years. Where do you see the biggest need for improvement in terms of industrial statistics?

Wheeler: I have learned that there is a world of difference between what we teach and the art of data analysis. One of my earliest experiences with this came when working with a group of engineers during the early 1970s. The data were the type that I had been taught to transform in order to achieve independence between the variation and the averages. When I presented my results, the lead engineer asked me why I had transformed the data. I replied that the transformation made the analysis work better.
She then said that my nonlinear transformation had destroyed the interpretability of the data, and that we needed to use an analysis technique that would work with the original data. Since I was still in thrall to the techniques, I did not fully appreciate her point of view until much later. I had not yet discovered that the purpose of analysis is insight, and that the best analysis is always the simplest analysis that produces the needed insight.

A few years later I was again working with a group of engineers, this time on a study of the effect of the type of on-street parking upon midblock accident rates. Since experiments were impossible, they were analyzing the existing data, which were very sparse. Out of more than 7,000 factor combinations, we had data for only 67 cells and multiple observations for only 45 cells. But not to worry. With the brand-new power of SAS 76, I was able to obtain a beautiful analysis, complete with a printout of all of the (complicated) comparisons used to find each of the different types of sums of squares. At this point I knew the answers, but I found that I could not explain them to the engineers in such a way that they could, in turn, explain them to the Federal Department of Transportation, which was funding the study.

To get around this obstacle I went back to the data and defined simple comparisons that were easy to interpret and would therefore be easy to explain and understand. I then ranked these comparisons in order, beginning with the potential signals (those that were most unlikely to occur by chance) and proceeding down to the probable noise (those that were likely to have occurred by chance). [A sketch of this ranking approach appears below.] There were only four comparisons that involved the degree of utilization of the on-street parking, and these four comparisons topped the list. This suggested that the degree of utilization was the major factor that affected the midblock accident rate: The higher the utilization, the higher the accident rate. The comparisons between the different types of on-street parking (angle parking versus parallel) were no higher than No. 14 on the list. And in between, at position No. 10, was a comparison that everyone agreed had to be a spurious result (and was therefore due to noise). This was interpreted to mean that the type of on-street parking did not have a pronounced effect upon the midblock accident rate. Based on this analysis the engineers were comfortable in telling the Federal Department of Transportation that the type of on-street parking (parallel or angle) had no effect upon the midblock accident rate, but that it was the degree of utilization that mattered. Fully utilized parallel parking turned out to be just as dangerous as fully utilized angle parking. Any type of on-street parking is perfectly safe—as long as no one uses it!

The point of these two stories out of my past is that when we take our client’s data, push them through a black box, and give our client the results, our client can only take those results on faith. Until the results are understandable, no one can (or will be willing to) use the data to take action. And to understand the results the client will need to understand enough of the analysis to see the linkage between the data and the results. (This is where good graphs of the data can save the day.) Yet as I read articles today I find more and more complex analyses being offered simply because we have the computing power to do so. Whether the analysis makes sense does not seem to be part of the equation.
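The interview does not give the actual comparisons from the parking study, so the following is only a hypothetical sketch of the ranking idea described above: express each simple comparison as a difference of group averages, scale it by its standard error, and sort from the comparisons least likely to be chance (potential signals) down to those most likely to be chance (probable noise). The data, comparison names, and the crude t-like ratio are all illustrative assumptions, not the original analysis.

```python
# Hypothetical sketch: rank simple two-group comparisons from "potential
# signal" down to "probable noise" by the size of a t-like ratio.
# The accident-rate numbers and comparison names are invented for illustration.
from math import sqrt
from statistics import mean, stdev

comparisons = {
    # name: (group A rates, group B rates)
    "high vs. low utilization":   ([4.1, 3.8, 4.5, 4.0], [1.9, 2.2, 1.7, 2.1]),
    "angle vs. parallel parking": ([3.0, 2.7, 3.2, 2.9], [2.8, 3.1, 2.6, 3.0]),
    "two-way vs. one-way street": ([2.5, 2.9, 2.4, 2.8], [2.6, 2.3, 2.7, 2.5]),
}

def t_like_ratio(a, b):
    """Difference of the two averages divided by its pooled standard error."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / sqrt(pooled_var * (1 / na + 1 / nb))

# Largest |ratio| first: potential signals at the top, probable noise at the bottom.
ranked = sorted(comparisons.items(),
                key=lambda kv: abs(t_like_ratio(*kv[1])), reverse=True)
for name, (a, b) in ranked:
    print(f"{name:30s}  ratio = {t_like_ratio(a, b):6.2f}")
```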
A few years ago I got a call from an executive in the South African Revenue Service. He could not understand what his employees were doing in their efforts to identify which returns to audit. Since I had worked with him when he ran a South African bank, he wanted me to look at what his people were doing, and also to develop a technique that he and other administrators could understand. As I expected, they were trying to develop a classification model. This type of model seeks to find a combination of input variables that will correlate with the value of some outcome variable. I asked for a random sample from their database and received a file containing some 30,000 records for each of three years. As I reviewed these data I discovered that they did not have a true outcome variable for their classification analysis. They were basically fitting the input variables to themselves.

Given the lack of an outcome variable, I defined a simple way to use the input variables to define affinity groups and then to identify outliers within each group. Then I asked my client’s technical staff to apply their technique to the same sample I was using to see to what extent our two approaches overlapped. Their response was that they could not get their technique to work with only 30,000 data! Whenever an analysis technique is so convoluted that 30,000 data are not sufficient to get the technique to work, that technique is almost certainly fitting the noise of the data rather than finding the signals within the data. This is simply one more example where the ability to do a set of computations has triumphed over common sense, and the use of the technique has become more important than making sense of the data.

The ability to do a computation does not guarantee that the result will be meaningful. For example, any number of computer packages will allow you to compute the average and standard deviation for a set of telephone numbers. You might even, if you are so inclined, test the telephone numbers for normality.

QDD: What do you believe are the most important issues in the quality field today?

Wheeler: Since my whole career has been one of teaching statistics to nonstatisticians, I have a different perspective than most of my colleagues in statistics. This perspective has convinced me that we are being held back by our use of the academic pedagogical model in our industrial training. To explain what I mean by this, I need to tell you a bit more about my own journey.

At Tennessee we were teaching the usual statistical curriculum with the addition of SPC. As I mentioned earlier, we would often get students excited about what they could do with data analysis. But by the time they were in a position to use these techniques in their work, they would often have forgotten how. (Like the superintendent I encountered who had had an SPC class in college but had no idea that he could use it to run his steel furnaces more effectively.) In short, there was very little transfer from the classroom to real life. Then, during the first nine months of 1982, I had the opportunity to teach several SPC classes in industry. The students literally went out of the classroom and started successfully using the techniques the very next day. I could see how these classes were making a difference both in the lives of the students and in the operation of their organizations.
Thus, by the summer of 1982 I had experience with two different pedagogical models for teaching statistics to nonstatisticians—one that worked and one that did not work. So what was the difference between these two pedagogical models? Why did one work and the other not work? Well, the first difference was one of timing. The SPC classes had been taught to those who needed to analyze data. Nevertheless, I am convinced that it was not just a matter of timing. If I had insisted on teaching a standard engineering statistics curriculum of description, probability, and inference, I would not have had the same success that I had with SPC.

The reason for my success with the SPC classes has to do with the systematic approach to data analysis which they contained. When professional statisticians analyze data, we quickly learn to check both the context and the data themselves for homogeneity. We learn to look for anomalies. Based on these somewhat subliminal clues, experienced analysts will avoid pitfalls that would trap the unwary. Thus, professional statisticians will commonly use their deep knowledge and experience to deal with the question of homogeneity in an ad hoc fashion. While this may work for experienced analysts, it is very hard to teach to our students. Without this deep knowledge and experience, how are nonstatisticians going to successfully avoid the pitfalls when they analyze data?

Rather than leaving the question of homogeneity to be approached in an ad hoc manner, we need to give nonstatisticians some systematic way to consider the question of homogeneity. And when SPC is taught correctly, this is precisely what they get. When nonstatisticians learn the secret of rational subgrouping and rational sampling, they find that process behavior charts provide a simple and powerful way to examine a data set for homogeneity. If the data are homogeneous, then the questions of inference, description, and prediction all make sense. If the data are not homogeneous, then the underlying process is subject to the effects of dominant assignable causes. Moreover, until they discover the nature of these assignable causes, and take steps to remove their effects from their process, the questions of inference, description, and prediction are all moot. Finally, since assignable causes, by definition, have a dominant effect upon a process, steps taken to remove the effect of an assignable cause will substantially reduce the variation within the product stream. Hence, a properly taught class in SPC will equip nonstatisticians to make sense of their data, to avoid the pitfalls of a faulty analysis, and to substantially reduce their process variation.

One day a student of mine, a self-admitted math phobic, brought me an XmR chart. Her subordinates in a plant in Europe had done an experiment comparing some antiperspirant compounds. Their one-way [analysis of variance] ANOVA was significant, and so they claimed that their new compound worked. My student could not understand their analysis, and so she had put their data on an XmR chart. As she looked at the chart, she could see only one real difference: Every member of the test panel had been right-handed, and so their right arms had perspired more than their left arms! It turned out that the compounds were not detectably different at all. And it was the simple chart that kept her from being misled by the faulty interpretation of the original analysis.
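The XmR chart in this story is simple enough to sketch in a few lines. The sketch below is a generic illustration with made-up numbers, not the student’s actual data: the limits come from the average of the individual values and the average moving range, using the conventional XmR scaling factors (2.66 for the individual values and roughly 3.27 for the moving ranges).

```python
# Minimal XmR (individuals and moving range) chart sketch.
# The readings below are made up for illustration only.
from statistics import mean

values = [8.2, 7.9, 8.4, 8.1, 7.8, 8.3, 10.6, 10.4, 10.8, 10.5, 10.2, 10.7]

moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]

x_bar = mean(values)           # center line for the X chart
mr_bar = mean(moving_ranges)   # average moving range

# Natural process limits for the individual values (2.66 = 3/d2 for n = 2).
unpl = x_bar + 2.66 * mr_bar
lnpl = x_bar - 2.66 * mr_bar
# Upper range limit for the moving ranges (conventional factor, about 3.27).
url = 3.268 * mr_bar

print(f"X-bar = {x_bar:.2f}, natural process limits = [{lnpl:.2f}, {unpl:.2f}], URL = {url:.2f}")

# Points outside the natural process limits are potential signals that the
# process has changed and deserve investigation.
signals = [i for i, x in enumerate(values) if x > unpl or x < lnpl]
print("points outside limits:", signals)
```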
Thus, the difference between the two pedagogical models is that SPC focuses on making sense of the data rather than on teaching a laundry list of tools and techniques of mathematical statistics. While the traditional curriculum is useful as a foundation for further study, and while it provides a basis for analyzing experimental data, the vast majority of the data in this world do not come from planned experiments. Today I teach several different seminars for nonstatisticians. Among these is a course in industrial experimentation that has an outstanding record of success. Yet I teach 10 seminars on SPC for every seminar on experimentation. This is because the bulk of the data that people need to understand is not experimental data. They need techniques for analyzing and interpreting observational data, and this is precisely what they get with a well-taught SPC class. Moreover, this ability to analyze observational data is precisely what is missing from the traditional curriculum.

Now consider what has been happening over the past decade or so. The major thrust of industrial training has been to take the traditional curriculum, mix in some organizational material, and cram it all into four weeks of classes. One company I worked with taught its Black Belts a list of 56 statistical techniques. When I came around, the first question was always, “When do I use the different techniques?” In answer to this question I wrote another 400-page book. Since the primary question of analysis is always the question of appropriate homogeneity, and since the process behavior chart is a direct test of homogeneity, there is virtually no wasted time or effort in starting your analysis with a properly framed process behavior chart. If inferences are needed, much of the preliminary work has already been done. If inferences are inappropriate, then you will know that and can proceed to take appropriate action. Thus, the distance between collecting the data and taking appropriate action is minimized by the SPC approach.

Moreover, as the dissertation of Dr. Charles Champ showed, when we use the Western Electric zone tests with an X chart or an average chart, we have a technique that comes close to having the maximum possible power. [A sketch of these zone tests appears below.] Thus, when it comes to the analysis of observational data, the process behavior chart is a technique of unsurpassed simplicity and near maximum power. This combination makes it hard to beat. In training nonstatisticians, it is better to give them one technique that they can use and use well than to introduce them to a grab bag of techniques that they neither understand nor can use.

So here is the problem. We have a whole generation of people who think they can do statistics because they can use some statistical software package. At the same time, they are clueless about their incompetence at avoiding the pitfalls inherent in data analysis, and the traditional curriculum does not teach them how to overcome this incompetence. With the availability of software today, people are going to be analyzing data. When it comes to training nonstatisticians, the question is this: Do we want to teach them dozens of techniques so that they can use the software to (a) fit the input variables against themselves or (b) completely misinterpret ANOVA results? Or do we want them to learn simple techniques that virtually everyone can use to analyze data and come up with appropriate and meaningful results?
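For reference, the four Western Electric zone tests mentioned above are easy to state and to automate. The helper below is a generic sketch, not Wheeler’s own materials or any particular software package: given a running record, a center line, and a sigma value, it flags a single point beyond three sigma, two out of three successive points beyond two sigma on the same side, four out of five successive points beyond one sigma on the same side, and eight successive points on the same side of the center line.

```python
# Generic sketch of the four Western Electric zone tests for a running record.
def zone_test_signals(values, center, sigma):
    """Return (index, rule) pairs wherever one of the four zone tests fires."""
    signals = []
    # Signed distance from the center line in sigma units.
    z = [(x - center) / sigma for x in values]

    for i in range(len(z)):
        # Rule 1: one point beyond 3 sigma.
        if abs(z[i]) > 3:
            signals.append((i, "rule 1"))
        # Rule 2: two out of three successive points beyond 2 sigma, same side.
        if i >= 2:
            window = z[i - 2:i + 1]
            if any(sum(1 for v in window if side * v > 2) >= 2 for side in (+1, -1)):
                signals.append((i, "rule 2"))
        # Rule 3: four out of five successive points beyond 1 sigma, same side.
        if i >= 4:
            window = z[i - 4:i + 1]
            if any(sum(1 for v in window if side * v > 1) >= 4 for side in (+1, -1)):
                signals.append((i, "rule 3"))
        # Rule 4: eight successive points on the same side of the center line.
        if i >= 7:
            window = z[i - 7:i + 1]
            if all(v > 0 for v in window) or all(v < 0 for v in window):
                signals.append((i, "rule 4"))
    return signals

# Example with made-up data: a small sustained shift trips rules 3 and 4.
data = [0.1, -0.2, 0.3, 0.0, 1.2, 1.4, 1.1, 1.3, 1.2, 1.5, 1.1, 1.4]
print(zone_test_signals(data, center=0.0, sigma=1.0))
```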
As Bill Scherkenbach says, “The only reason to collect data is to take action.” Taking the correct action requires insight, and insight is based on understanding. To this end we do not need more statistical techniques, but more managers, engineers, and scientists trained in the fundamentals of how to make sense of their data. In my experience, a well-taught SPC class achieves this objective much faster, and much more effectively, than the traditional curriculum of descriptive statistics, probability theory, and statistical inference that is focused on the analysis of experimental data.

QDD: So how would you summarize your career?

Wheeler: All in all, I would have to say that I was at the right place, at the right time, in order to serve the needed apprenticeship, to make my mistakes in the privacy of the university classroom, and to step into the role of consultant and author with the support and guidance of both professor David Chambers and Dr. W. Edwards Deming. Moreover, I had a supportive wife who was willing to let me take a chance as a consultant, and who was later able and willing to serve as my editor and publisher.
About The Author
Dirk Dusharme @ Quality Digest
Dirk Dusharme is Quality Digest’s editor in chief.
Comments
Dr. Wheeler's classes
Over the years, I have had the pleasure of attending Dr. Wheeler's classes on SPC and advanced SPC, as well as experimentation - some of them more than once. He has written a plethora of books on the subject, and I still reference those materials today. It is the most solid training in SPC that you can find anywhere.
Tripp Babbitt
A need to get back to the basics.
What a great statement: "Or do we want them to learn simple techniques that virtually everyone can use to analyze data and come up with appropriate and meaningful results?"
In my experience, a large percentage of quality managers don't even know how to properly conduct a brainstorming session. What chance have they with complex and misapplied tools such as ANOVA, and complex and useless practices such as data transformations?