By: Donald J. Wheeler

The keys to effective process behavior charts are rational sampling and rational subgrouping. As implied by the word rational, we must use our knowledge of the context to collect and organize data in a way that answers the interesting questions. This column will show the role that sample frequency plays in constructing an effective XmR chart.

One of my clients had an online temperature gauge that could be sampled at different frequencies. The process engineer wanted to use these data to create a process behavior chart. He began by sampling the temperature once every 128 seconds, giving a sampling frequency of about 28 times per hour. The resulting XmR chart is shown in figure 1.


Figure 1: Temperature sampled 28 times per hour

This chart shows 50 consecutive readings that span almost two hours of production. The process is being operated predictably, with an average near 19°C. The observed temperatures varied from 16.5° to 21.6°, while the natural process limits range from 14.9° to 22.7°. Thus, unless something changes, these process temperatures can be expected to vary from 15° to 23° in the future.
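For readers who want to reproduce this kind of chart, here is a minimal sketch of the XmR computation, using the standard scaling factors (2.66 for the natural process limits and 3.268 for the upper range limit). The temperature readings below are simulated stand-ins, not the article's actual data.

```python
import numpy as np

def xmr_limits(x):
    """Compute XmR (individuals and moving range) chart limits.

    Natural process limits are mean +/- 2.66 times the average
    moving range; the upper limit for the mR chart is 3.268 times
    the average moving range.
    """
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))        # moving ranges between consecutive readings
    mr_bar = mr.mean()             # average moving range
    center = x.mean()              # process average
    return {
        "center": center,
        "lnpl": center - 2.66 * mr_bar,   # lower natural process limit
        "unpl": center + 2.66 * mr_bar,   # upper natural process limit
        "mr_ucl": 3.268 * mr_bar,         # upper limit for the mR chart
    }

# Hypothetical readings sampled every 128 seconds, for illustration only:
rng = np.random.default_rng(0)
temps = 19.0 + rng.normal(0.0, 1.3, size=50)
print(xmr_limits(temps))
```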

By: Paul Laughlin

Continuing our thinking about ways for data leaders to save money during a recession, this article drills into saving on your data usage. Following my last post reminiscing about the lessons I learned during past recessions, the early environmentalist slogan “reduce, reuse, recycle” has stayed in my mind. Beyond applying it to workloads and teams, I’ve begun to muse about how those approaches could apply to data.

By: Donald J. Wheeler

As the foundations of modern science were being laid, the need for a model for the uncertainty in a measurement became apparent. Here we look at the development of the theory of measurement error and discover its consequences.

The problem may be expressed as follows: Repeated measurements of one and the same item, if they are not rounded off too much, will frequently yield a whole range of observed values. If we let X denote one of these observed values, then it is logical to think of X as having two components. Let Y denote the actual, but unknown, value of the item measured, and let E denote the measurement error associated with that observation. Then X = Y + E, and we want to use our repeated measurements to find an estimate for Y in spite of the uncertainties introduced by the error terms E. As it turns out, the best estimate to use will depend on the properties of the distribution of the error terms E.
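How the best estimate depends on the error distribution can be seen in a small simulation. The sketch below, with arbitrary values chosen for Y, the number of measurements, and the error scale, compares the sample mean and sample median as estimators of Y under normal versus heavy-tailed errors; the mean does better in the first case, the median in the second.

```python
import numpy as np

rng = np.random.default_rng(1)
y_true = 10.0                       # the actual (unknown) value Y
n, trials = 25, 10_000              # repeated measurements per trial

for label, draw in [
    ("normal errors", lambda: rng.normal(0.0, 1.0, size=(trials, n))),
    ("heavy-tailed (Laplace) errors", lambda: rng.laplace(0.0, 1.0, size=(trials, n))),
]:
    x = y_true + draw()             # X = Y + E for each measurement
    mean_mse = np.mean((x.mean(axis=1) - y_true) ** 2)
    median_mse = np.mean((np.median(x, axis=1) - y_true) ** 2)
    print(f"{label}: MSE(mean)={mean_mse:.4f}  MSE(median)={median_mse:.4f}")
```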

By: Donald J. Wheeler, Al Pfadt

In memory of Al Pfadt, Ph.D.

This article is a reprint of a paper Al and I presented several years ago. It illustrates how the interpretation and visual display of data in their context can facilitate discovery. Al’s integrated approach is a classic example not only for clinical practitioners but also for everyone who needs to turn data into knowledge.

This is an example of how process behavior charts were used to (1) evaluate outcomes and (2) assist in making clinical decisions in the treatment of severe, potentially life-threatening, self-injurious behavior (viz., self-inflicted wounds to the body caused by head-banging and biting the wrists and fingers). The treatment of Peter, a 25-year-old man with autistic disorder who functions in the severe range of intellectual disability and has been blind since birth, is described from two points of view: first, from the perspective of Dr. Al Pfadt, the behavioral psychologist who constructed and analyzed the charts shown here; then from the perspective of Peter’s parents. The following material was provided by Pfadt and his colleagues at the New York State Institute for Basic Research in Developmental Disabilities and is used with the permission of Peter’s parents.

By: Alan Metzel

Almost seven years ago, Quality Digest presented a short article by Matthew Barsalou titled “A Worksheet for Ishikawa Diagrams.” At the time, I commented on enhancements that could provide greater granularity. Indicating that he would probably have little time to devote to such a project, Barsalou graciously invited me to expand upon his work. As one of the few positive outcomes of the recent storms that have raged across the United States, I’ve finally completed that task. In thanks to Barsalou, and with tribute to his mentor, I refer to this effort as the “Enhanced Perkin Tracker.”


For those who might be unfamiliar with the Ishikawa diagram, it’s a graphic problem-solving tool used to relate multiple potential causes to a single effect in a rational manner. Based on its shape, it’s easy to understand why it’s often referred to as a “fishbone” or “herringbone” diagram.
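As a minimal illustration of that structure, the sketch below represents a fishbone as a mapping from cause categories to candidate causes feeding a single effect. The classic 6M category names and the example causes are hypothetical, not drawn from Barsalou’s worksheet.

```python
# A fishbone diagram as data: one effect, many categorized candidate causes.
# Category names (the "6M"s) and causes are illustrative assumptions.
effect = "Intermittent solder defects"
fishbone = {
    "Man":         ["insufficient training"],
    "Machine":     ["worn feeder", "oven temperature drift"],
    "Material":    ["oxidized paste"],
    "Method":      ["stencil not cleaned between runs"],
    "Measurement": ["inspection threshold set too loose"],
    "Environment": ["humidity out of spec"],
}

print(f"Effect: {effect}")
for category, causes in fishbone.items():
    print(f"  {category}")
    for cause in causes:
        print(f"    - {cause}")
```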

By: Donald J. Wheeler

The computation for skewness does not fully describe everything that happens as a distribution becomes more skewed. Here we shall use some examples to visualize just what skewness does—and does not—involve.

The mean for a probability model describes the balance point. The standard deviation describes the dispersion about the mean. Yet a simple description of skewness is elusive. Depending on which book you read, skewness may be described as having something to do with the relative size of the two tails, or with the weight of the heavier tail of a probability model.

By far the easiest way to understand what increasing skewness does to a probability model is to compare models with different amounts of skewness. But before we can do this, we have to first standardize those models. This is because skewness is defined in terms of standardized variables; skewness is what happens after we have taken into account differences in location and dispersion. (If we compare two distributions that have not been standardized, differences in location and dispersion may obscure differences in skewness.) So here we will be working with standardized distributions where the mean is always zero and the standard deviation is always equal to one.
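In computational terms, the skewness of a data set is the average cubed value of the standardized observations, E[z³]. A minimal sketch follows; the distributions chosen here are common textbook examples, not Wheeler’s specific models.

```python
import numpy as np

def standardized_skewness(x):
    """Skewness as the mean cubed standardized deviation, E[z^3]."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()   # standardize: mean 0, standard deviation 1
    return np.mean(z ** 3)

rng = np.random.default_rng(2)
samples = {
    "normal (skewness 0)": rng.normal(size=100_000),
    "exponential (skewness 2)": rng.exponential(size=100_000),
    "lognormal (more skewed)": rng.lognormal(mean=0.0, sigma=1.0, size=100_000),
}
for name, x in samples.items():
    print(f"{name}: {standardized_skewness(x):+.2f}")
```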

By: William A. Levinson

Corrective action and preventive action (CAPA) is probably the most important process in any quality management system because so much else depends on it. This includes not only its traditional role as a response to defects, nonconformances, customer complaints, and audit findings, but also outputs of the management review. It can even address all seven Toyota production system wastes if we redefine as a “nonconformance” any gap between the current state and a potential or desirable future state. AIAG’s CQI-22, Cost of Poor Quality Guide [1], recommends that we compare “the ideal state for how work processes should perform” against “the current reality.”

Inadequate CAPA is a leading source of ISO 9001:2015 [2] and IATF 16949:2016 [3] findings, and FDA Form 483 citations [4]. “Five Signs Your Company Is in Dire Need of Root Cause Analysis and Corrective Action Training” reinforces this point even further [5]. While ISO 9001:2015 doesn’t have a specific requirement for preventive action, one could argue that clause 6.1.1 (c), “prevent, or reduce, undesired effects,” constitutes an implied requirement.

By: Donald J. Wheeler

The cumulative sum (or Cusum) technique is occasionally offered as an alternative to process behavior charts, even though they have completely different objectives. Process behavior charts characterize whether a process has been operated predictably. Cusums assume that the process is already being operated predictably and look for deviations from the target value. Thus, by replacing process characterization with parameter estimation, Cusums beg the very question process behavior charts were created to address.

To illustrate the Cusum approach and compare it with an average chart, I’ll use the example from page 20 of Shewhart’s first book, Economic Control of Quality of Manufactured Product (Martino Fine Books, 2015 reprint). These data consist of 204 measurements of electrical resistivity for an insulator. Shewhart organized them into 51 subgroups of size four, based upon the time order in which the measurements were obtained. Figure 1 gives the averages and ranges for these 51 subgroups.
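For readers unfamiliar with the mechanics, here is a minimal sketch of a conventional tabular (two-sided) Cusum applied to subgroup averages, with reference value k and decision interval h expressed in standard-error units; k = 0.5 with h = 4 or 5 are common choices. The data are simulated with a deliberate shift, not Shewhart’s resistivity measurements.

```python
import numpy as np

def tabular_cusum(xbar, target, sigma_xbar, k=0.5, h=4.0):
    """One-sided upper/lower Cusum statistics for subgroup averages."""
    c_plus, c_minus = 0.0, 0.0
    out = []
    for x in xbar:
        z = (x - target) / sigma_xbar
        c_plus = max(0.0, c_plus + z - k)    # accumulates upward drift
        c_minus = min(0.0, c_minus + z + k)  # accumulates downward drift
        out.append((c_plus, c_minus, c_plus > h or c_minus < -h))
    return out

# Hypothetical subgroup averages with a one-sigma shift at subgroup 21:
rng = np.random.default_rng(3)
xbars = 100 + rng.normal(0.0, 1.0, size=40)
xbars[20:] += 1.0
for i, (cp, cm, signal) in enumerate(tabular_cusum(xbars, 100.0, 1.0), 1):
    if signal:
        print(f"Cusum signals at subgroup {i}")
        break
```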

By: OpusWorks

Over two days, engage in eight unique best-practice sessions with 11 process improvement thought leaders at S.O.A.R. 2022, OpusWorks’ annual virtual conference.

Designed to present actionable information and game-changing strategies from experienced and inspiring practitioners, S.O.A.R. will enable you to better lead your organization-wide transformation by showing you how to:

Systematize processes
Operationalize excellence
Accelerate scaling
Resolve to innovate

Day One Agenda, Wednesday, September 28, 2022

10:00: Rapid Scaling with OpusWorks in 2022

Rob Stewart, OpusWorks CEO
Dan Rice, OpusWorks COO
Vickie Kamataris, Chief Content and Delivery, MBB, OpusWorks Institute (OWI)

After Rob kicks off S.O.A.R. 22, Dan and Vickie will provide their perspectives about the OpusWorks solution set in the context of today’s challenges. They will also update attendees on what’s new from OpusWorks since S.O.A.R. 2021:

By: Harish Jose

In today’s column, I’m looking at the Ohno Circle in light of German philosopher Martin Heidegger’s ideas. I’ll try to stay away from the neologisms used by Heidegger and will only scratch the surface of his deep insights.

One of the best explanations of the Ohno Circle comes from one of Taiichi Ohno’s students, Teruyuki Minoura, a former president and CEO of Toyota Motor Manufacturing North America, who had firsthand experience of it. Minoura noted: “Mr. Ohno often would draw a circle on the floor in the middle of a bottleneck area, and he would make us stand in that circle all day long and watch the process. He wanted us to watch and ask ‘Why?’ over and over.

“You may have heard about the five ‘whys’ in TPS. Mr. Ohno felt that if we stood in that circle, watching and asking why, better ideas would come to us. He realized that new thoughts and new technologies don’t come out of the blue—they come from a true understanding of the process.”
