
Davis Balestracci

Quality Insider

Another Strategy for Determining Common Cause

Cut new windows but try not to upset the daily routine

Published: Monday, November 5, 2012 - 15:36

Remember the early days of TQM? When I present to healthcare audiences, all I have to do is mention “lab turnaround time” to get a collective groan and smile. That was always one of the initial forays into what was called total quality management (TQM) or continuous quality improvement (CQI).

Most initial projects like lab turnaround time quickly turned into the project from hell due to over-collection of vague data, driven by a huge cause-and-effect (Ishikawa) diagram built to answer an equally vague question such as, “What causes long turnaround times?” It can be so tempting to naively jump right to today’s strategy before even knowing where to focus. I’d like to revisit this situation with 20–20 hindsight to teach some lessons. (Of course, today you would first connect any project work to a “big dot” in the boardroom, right?)

Data strategies No. 1—Exhaust in-house data—and No. 2—Study the current process—could have been used to get an approximate baseline of the problem’s extent, both in terms of length of time and frequency of occurrence. Some good stratification would then help isolate and identify the lab procedures, departments, times of day, or days of the week that account for the majority of “too long” times (whatever that means—some good operational definition work would be helpful, too).
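A minimal sketch of what that baseline and stratification might look like, assuming the in-house data can be pulled into a flat table with one row per completed test. The file name, the column names, and the 60-minute “too long” threshold below are all hypothetical placeholders, not part of the original project:

    import pandas as pd

    # Hypothetical extract of in-house data (strategy No. 1):
    # one row per completed test.
    df = pd.read_csv("lab_orders.csv")

    # "Too long" needs an operational definition; 60 minutes is a placeholder.
    df["too_long"] = df["turnaround_min"] > 60

    # Stratify (strategy No. 2): which procedures, departments, and times
    # of day account for the majority of the "too long" results?
    for factor in ["test_type", "department", "hour_ordered"]:
        summary = (df.groupby(factor)["too_long"]
                     .agg(n="size", n_too_long="sum", pct_too_long="mean")
                     .sort_values("n_too_long", ascending=False))
        print(summary.head(10))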

These initial studies of the current process would be based on an agreed global definition of “overall lab turnaround time.” One possibility might be the elapsed time from the test being ordered to the time the result is available to the patient... or should it be the physician? There had better be agreement.

For deeper insight into the “measurement process” input to this process, it would also be interesting to stratify these results by recording any words on the order, such as “STAT,” “STAT!!,” “Rush,” or “Priority.” Do they really make any difference, i.e., result in lower turnaround times?
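If the order wording were captured, a quick check of that question might look like the following sketch; the order_text column is another assumption:

    import pandas as pd

    df = pd.read_csv("lab_orders.csv")  # hypothetical extract, as above

    # Tag each order with the first priority word it contains, if any.
    df["priority_word"] = (df["order_text"].str.upper()
                             .str.extract("(STAT|RUSH|PRIORITY)", expand=False)
                             .fillna("none"))

    # Do the priority words actually correspond to lower turnaround times?
    print(df.groupby("priority_word")["turnaround_min"]
            .agg(["count", "median"]))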

If a major opportunity is isolated, this “overall lab turnaround time” would now have to be broken down or “disaggregated” into its various components. Such component times might include transportation to the lab, waiting in the work queue, performing the test, communicating the result to the physician (or hall nurse), or time to get the result to the patient. Also, each individual component would need to be clearly defined for everyone involved.
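As a sketch of what such a disaggregation might produce, assuming the special collection records a timestamp at each hand-off (every name below is hypothetical):

    import pandas as pd

    # Hypothetical special collection: one row per test, one timestamp
    # per agreed-upon hand-off.
    df = pd.read_csv("lab_turnaround_study.csv")
    stamps = ["t_ordered", "t_received_in_lab", "t_testing_started",
              "t_testing_done", "t_result_reported"]
    for col in stamps:
        df[col] = pd.to_datetime(df[col])

    # Cut new windows: break total turnaround time into component durations.
    components = {
        "transport_min": ("t_ordered", "t_received_in_lab"),
        "queue_min":     ("t_received_in_lab", "t_testing_started"),
        "testing_min":   ("t_testing_started", "t_testing_done"),
        "reporting_min": ("t_testing_done", "t_result_reported"),
    }
    for name, (start, end) in components.items():
        df[name] = (df[end] - df[start]).dt.total_seconds() / 60

    # Which component dominates the total time?
    print(df[list(components)].median().sort_values(ascending=False))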

So, using the first two common cause strategies, you now have a project baseline and have isolated a major opportunity. You have no doubt also learned about how to handle that pesky “human variation” and realized the need to be quite formal about special data collections.

What’s next?

Common cause strategy No. 3: Cut new windows (process dissection)

Juran called his third data strategy “cut new windows” (Brian Joiner calls it “disaggregation” or “process dissection,” interchangeably). This takes strategy No. 2—study the current process (i.e., stratification) one step further by gathering data that are, once again, not routinely collected.

The purpose is to further focus—within a major identified opportunity—to identify an even deeper source of variation that is hidden in the aggregate overall performance measure. The process now must be deeply dissected beyond any routine requirements or easily obtainable data.

The good news is that this must be done only for the 20 percent of the process that causes 80 percent of the problem (aka the Pareto principle), which offers a high probability of being productive. However, because of the level of detail needed, this task requires much more disturbance of the daily workflow to get at these data. Unlike “study the current process,” where one easily records data that are virtually there for the taking, this next step involves breaking a process into its subprocesses, which will take significant time and effort on the part of the data recorders.

Deeper focus

The vital processes have now been “dissected” into their subprocesses. There might even be additional factors to consider when planning the collection. For instance, the previous analyses may have shown that it is only necessary to collect these data for one or several specifically isolated time periods during the day. Or they might have shown that Mondays or weekends had unique, specific problems vs. a more typical day.

Breaking apart the total time is, in effect, cutting a new window into the process—i.e., dissecting it to look at different perspectives and smaller pieces. And note: This level of detail is applied only to the tests that Pareto analysis of the data collected in strategies No. 1 and No. 2 has identified as a major problem—the “20 percent of the test types causing 80 percent of the excessive test times.”
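One way that Pareto screen might look in code, reusing the hypothetical table and placeholder threshold from the earlier sketches:

    import pandas as pd

    df = pd.read_csv("lab_orders.csv")          # hypothetical, as above
    df["too_long"] = df["turnaround_min"] > 60  # placeholder definition

    # Rank test types by their share of the "too long" results.
    counts = (df[df["too_long"]]
              .groupby("test_type").size()
              .sort_values(ascending=False))
    cum_share = counts.cumsum() / counts.sum()

    # Keep test types until the cumulative share first reaches 80 percent.
    vital_few = cum_share[cum_share.shift(fill_value=0) < 0.80].index
    print("Dissect only these test types:", list(vital_few))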

With such a focused data collection, gathering the required information will involve no more personnel than necessary; data collection in other areas would add little value for the effort expended.

Further benefit: This study of the process for lab procedures with the most problematic turnaround times might also ultimately provide some benefit for all procedures.

For planning purposes, previous work on an appropriately detailed flowchart would expose the best leverage points for data collection and allow the creation of an effective data collection sheet.

Use process dissection only after a significant source of variation has been exposed and isolated. It is a very common error for teams to jump directly to this step at the outset. This not only results in too much data, but more important, it inconveniences people who don’t need to be inconvenienced—a mistake I have made in the past more often than I care to admit. The data collection has no ultimate benefit for them, and your credibility as a facilitator becomes suspect.

As you know, this strategy (actually, any strategy) is a major upset to the daily routine. Make sure its use has been preceded by the only mildly inconveniencing data strategies No. 1 and No. 2 to isolate a major opportunity on which to focus. Make sure, too, that any additional work the culture perceives is also perceived as ultimately valuable. This will enhance your reputation as well as create better organizational “beliefs” about quality improvement.


About The Author


Davis Balestracci

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.