
Davis Balestracci


The Universal Process Flowchart × 4

This tends to get a whole lot of emotional interpretation flowing

Published: Monday, April 7, 2014 - 15:52

To summarize my last three articles, most improvement approaches come out of the same theory and are based on the assumption that everything is a process.

The universal process flowchart in Figure 1 sums it up beautifully. The boxes across the top show how executives think the process works. Charted underneath is the frontline's frustrating reality: how the process really works, and why it needs improvement.

Figure 1: Universal process flowchart

In my last article, “Finding the Unnecessary and Everyday Variation,” I challenged you to give serious consideration to “the current everyday use of data” as a process, one that needs improvement. So let’s apply a tool common to all approaches—a high-level flowchart (shown in Figure 2).

Figure 2: A high-level flowchart

The current use of data actually comprises four processes:
• Measurement
• Collection
• Analysis/display
• Interpretation

Each of these processes potentially has the six classic sources of process inputs: people, methods, machines, materials, measurements, and environments. Each process also has a “quality” associated with it and is subject to outside variation that can compromise this quality. Unless the variation in these processes is minimized, there is a danger of reacting to the variation in the data process and not the process one is trying to understand and improve.

And what is the biggest danger? Human variation: in perceiving the variation being studied (objective), in the process of defining a literal number to collect (measurement), and in actually executing the “methods” of measurement, collection, analysis/display, and interpretation.

Although measurement and collection are very important, the critical thinking needed to address them will be motivated by looking at any data’s routine current use. And most of this use takes place during scheduled meetings involving data.

Aren’t many of these meetings scheduled because of undesirable variation? These include operational performances compared to goals, financial performances compared to budgets, comparisons using benchmarks, unforeseen incidents, and project progress, to name a few. The goal of these meetings is to reduce the gap (variation) between what is observed vs. what is desired with the hope of taking actions to drive the goal toward the latter.

How are data typically used? Either raw data are handed out, and the meeting is analysis/display and interpretation, or (alleged) analyses and displays have been done beforehand, and the meeting is interpretation. The result of either is some type of action that becomes an input to the everyday work processes.

What is the process by which a group of executives, managers, or workers looks at the data and any analyses/displays to take action?

Don’t many currently favored displays have a danger of causing human variation in perceiving and interpreting variation? For example, in data tables, bar graphs, trend lines, percentile ranking, or variance (from goal), analysis people draw circles around numbers they “don’t like.” These result in people observing variation between a number on the page vs. what they “feel” it should be, but there is variation in how people perceive this variation and what people consider “too much.” Some people also see “trends,” but some don’t.

There is a lack of common theory for interpretation. Because of this, there is variation not only in how the people in the room perceive variation, but also in the recommended actions for closing the perceived undesirable gap. And this doesn’t even take into account additional potential variation: whether the data’s definition allows it to be used for this purpose, or whether the data were even collected appropriately.

In other words, the meeting is pretty much using this analysis “tool”:

Why is there a lack of theory? Because all variation is treated as special cause, needing only an explanation from individual expertise to interpret it and take action. With everyone doing his or her best, recommended actions will come from individuals’ unique areas of expertise, resulting in more variation and argument... and “not enough time” to test proposed solutions. Somehow, someone’s solution wins.

One needs theory to make the crucial distinction between whether the observed variation is due to special causes or common causes. Depending on the source, completely different approaches will be required to deal with it.
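To make that distinction concrete, here is a minimal sketch (mine, not from the article) of the kind of calculation a process behavior chart rests on: an XmR (individuals) chart, whose natural process limits come from the average moving range scaled by the standard constant 2.66. The monthly numbers are hypothetical.

```python
# Minimal sketch (not from the article): classify variation as common or
# special cause using XmR (individuals chart) natural process limits.
# The 2.66 factor is the standard XmR scaling constant.

def xmr_limits(data):
    """Return (average, lower natural process limit, upper natural process limit)."""
    n = len(data)
    mean = sum(data) / n
    moving_ranges = [abs(data[i] - data[i - 1]) for i in range(1, n)]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * avg_mr, mean + 2.66 * avg_mr

# Hypothetical monthly values handed out at a routine meeting
monthly = [52, 48, 55, 47, 60, 51, 49, 58, 45, 62, 50, 53]

mean, lnpl, unpl = xmr_limits(monthly)
for month, value in enumerate(monthly, start=1):
    if lnpl <= value <= unpl:
        verdict = "common cause: work on the process"
    else:
        verdict = "special cause: investigate this point"
    print(f"Month {month:2d}: {value:3d} -> {verdict}")

print(f"Average = {mean:.1f}, natural process limits = [{lnpl:.1f}, {unpl:.1f}]")
```

With these made-up numbers, every point falls inside the limits: all the month-to-month movement the meeting would argue about is common-cause variation, and the appropriate action is to work on the process rather than explain individual points.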

How could you look at “the current use of data” and apply your improvement expertise? Look at a sample of any routine meeting’s raw data and objectively ask:
• What could these data tell you?
• What is actually being done with these data?
• What graphical displays are being used?
• What other displays are being used, e.g., traffic lights, rolling averages?
• What is the reliance on using tables of raw numbers?
• Is any difference between two numbers being treated as a special cause (e.g., smiley faces, “thumbs up or down,” month-to-month comparison, variance to goal)?
• What actions result from these data? Do they consider whether the variation being acted on is common or special cause?
• A key question/skill: Does a plot of these data over time exist? How could you construct one?
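As a minimal sketch of that last question (again mine, not the author's, with hypothetical numbers), a plot of the data over time needs nothing more than the values in time order and a reference line such as the median:

```python
# Minimal sketch (not from the article): a simple run chart of the same
# hypothetical monthly values, with the median as a reference line.
import statistics
import matplotlib.pyplot as plt

monthly = [52, 48, 55, 47, 60, 51, 49, 58, 45, 62, 50, 53]  # hypothetical data
median = statistics.median(monthly)

plt.plot(range(1, len(monthly) + 1), monthly, marker="o")
plt.axhline(median, linestyle="--", label=f"median = {median}")
plt.xlabel("Month")
plt.ylabel("Reported value")
plt.title("Run chart of the same numbers the meeting argues over")
plt.legend()
plt.show()
```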

 

Starting to feel uncomfortable? Good!


About The Author

Davis Balestracci’s picture

Davis Balestracci

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.