Davis Balestracci

Getting Real With Rapid-Cycle PDSA

The simplicity of smooth, upward, linear progress is a myth

Published: Tuesday, February 16, 2016 - 15:22

Marketers are relentless in their efforts to seduce you with fancy tools, acronyms, Japanese terminology—and promises—about their versions of formal improvement structures such as Six Sigma, lean, lean Six Sigma, or the Toyota Production System, each with its own unique toolbox.

In my last column, I discussed the need to become more effective by using far fewer tools in conjunction with critical thinking to understand variation.

In the midst of all this, I’ve seen W. Edwards Deming’s and Walter A. Shewhart’s brilliant, fundamental plan-do-study-act (PDSA) cycle morph into an oft-spouted platitude. I laugh when I hear people casually comment that PDSA and plan-do-check-act (PDCA) are the same thing. They’re not.

And now the tool of rapid-cycle PDSA is increasingly popping up in some way, shape, or form. Healthcare has especially taken a “Go on, just do it!” attitude, encouraging anyone to test ideas in their routine work as a way to work around sluggish management. The rallying cry is: “What can we do now? By next week? By Tuesday? By tomorrow?”

Another case of ‘Don’t just do something, stand there!’

Rapid-cycle PDSA in a nutshell (a brief data-collection sketch follows the list):
• Test on a really small scale. For example, start with one patient or one clinician at one afternoon clinic, and increase the numbers as you refine the ideas.
• Test the proposed change with people who believe in the improvement. Don’t try to convert people into accepting the change at this stage.
• Only implement the idea when you’re confident you have considered and tested all the possible ways of achieving the change.
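
Since the first point above is also where data collection during the change should begin, rather than as the afterthought discussed later, here is a minimal, hypothetical sketch in Python of what capturing one small-scale test might look like. The clinic scenario, cycle names, and numbers are invented for illustration, not taken from any of the cited guides.

# A minimal, hypothetical sketch: capture the data from one small-scale
# PDSA test (one clinician, one afternoon clinic) while the change is
# being tested, instead of reconstructing it afterward.
from statistics import median

# Invented measurements: minutes from check-in to rooming for the handful
# of patients seen during each test afternoon, cycle by cycle.
cycles = {
    "Cycle 1 (baseline)": [34, 41, 29, 38, 36],
    "Cycle 2 (new rooming step)": [31, 27, 33, 26, 30],
    "Cycle 3 (refined rooming step)": [24, 28, 22, 27, 25],
}

baseline = median(cycles["Cycle 1 (baseline)"])
for name, minutes in cycles.items():
    print(f"{name}: observations={minutes}, "
          f"median={median(minutes):.0f} min (baseline={baseline:.0f})")

Plotting those observations in time order as a run chart, and eventually a control chart, is what gives the “S” in PDSA something real to study.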

What sounds so easy and commonsensical at a conference or in a paper...

Figure 1: Sounds easy. (The source article offers free, full-text access and lets you download figures 1 and 2 as PowerPoint slides.)

...hits your messy reality when you try to apply it in an everyday environment.

Figure 2: Messy reality. (The original article includes a deeper discussion of this figure.)

The simplicity of smooth, upward, linear progress is a myth.

As many of us have discovered, application involves a complex tangle of frequent false starts, misfirings, plateaus, backslidings, and overlapping scenarios within the process. Cycles vary in size because not all of them have equal impact on a project’s development, and not all cycles are completed. Positive or negative residual effects from one cycle can linger. Cycles at varying levels of success interact with other cycles that are at different stages of PDSA. Some cycles merely explore or define limitations and setbacks, and it isn’t until later cycles that those challenges are harnessed to make improvement possible. And even that may not always happen.

But don’t worry: There are guides to (allegedly) help you, e.g., example No. 1 and example No. 2. You’d better have an aspirin bottle handy; I never fail to get a headache when I look at “prescriptive straitjackets” such as these.

You’re on the road to becoming an advanced practitioner when you realize you don’t need to keep learning additional prescriptive advanced tools—and only 1 to 2 percent of you need advanced statistics.

Revisiting John Heider’s quote from The Tao of Leadership (Green Dragon Publishing, 2005):
“Advanced students forget their many options. They allow the theories and techniques that they have learned to recede into the background.
Learn to unclutter your mind. Learn to simplify your work.
As you rely less and less on knowing just what to do, your work will become more direct and more powerful.”

The hands-down best reference on PDSA remains The Improvement Guide (Jossey-Bass, second edition, 2009).

The implicit ‘plan’ of rapid-cycle PDSA seems to be: Come up with a reasonably good idea to test and then plan the test

There is a naïve assumption that all it takes to improve quality is good people with good ideas doing their best.

As Deming growled many times: “They already are... and that’s the problem!” and “For every problem, there is a solution: simple, obvious... and wrong!”

Paraphrasing my mentor Heero Hacquebord: The most important problems are not the obvious ones. They are the ones of which no one is aware.

Remember the quality circles disaster during the 1980s? The view of 20th-century quality giant Joseph Juran was that such efforts could succeed only if they were grounded in an already strong, viable improvement culture with executive support. Actually, lean’s foundation of standardized work is probably one of the best ways to initially focus such a process.

There is a danger that many good ideas could be naively applied to the more obvious, superficial symptoms of deeper, hidden problems. In addition, one must inevitably deal with resistance and unintended human variation every step of the way while testing and then trying to implement the idea. And there’s the most nontrivial issue of collecting data during the change—usually an afterthought and planned ad hoc—if any is collected at all.

My respected colleague Mark Hamel has astutely observed:
• Human systems don’t naturally gravitate to discipline and rigor.
• Most folks are deficient in critical thinking, at least initially [my emphasis].

Today? Tomorrow? Next Tuesday? Next week? In an environment where leaders are not instilling discipline, prompting critical thinking, or facilitating daily kaizen?

Applying rapid-cycle PDSA means grappling with important nuances and the uneven, dynamic, and messy reality of implementation. Any environment has its own unique challenges and opportunities—the 20 percent of its process causing 80 percent of its problem. Change creates an interplay that is rarely neat and linear, and it is very culture-specific.

If you’re at a conference session or reading a paper that makes the process sound as simple as smooth, uphill, linear progress, the author could either be naively (and dangerously) unconscious of the reality or have sanitized the situation beyond recognition (especially regarding acceptance of the change). In any case, don’t trust any touted results, and ask lots of questions.

Also ask yourself, “What would Deming say?” I’ve heard him say: “What’s your theory? Examples without theory teach nothing!”

If you apply critical thinking to your current efforts, you’re guaranteed to run rings around the results of any neatly packaged and sanitized example that at best presents the 20 percent of its process that solved 80 percent of its problem—in its unique culture.

By all means, use rapid-cycle PDSA

Remember: There is no avoiding being in the midst of variation everywhere—including the variation in variation experienced among similar facilities. Effectively applying rapid-cycle PDSA requires being conscious of the need to complement its intuitive nature with the discipline necessary for good critical thinking and appropriate formality.

This also includes improving the process of using it. After each cycle, ask the following questions (a brief record-keeping sketch follows the list):
• What unexpected “variation” was encountered? What was learned about the improvement process itself?
• How will it be improved for future interventions?
• What aspects of the specific environment were relevant to the effectiveness of the intervention?
• What were the elements of the local-care (or product-quality) environment considered most likely to influence change and improvement?
• Will a similar intervention work in a different setting?
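
None of these questions comes with a prescribed format. As one hypothetical way to keep the answers from evaporating after each cycle, the sketch below records them in a simple Python structure; the field names are mine, not part of any standard.

# A hypothetical structure for capturing the post-cycle review so the
# learning about the improvement process itself isn't lost.
from dataclasses import dataclass, field

@dataclass
class CycleReview:
    cycle: str                       # e.g., "Cycle 2: new rooming step"
    unexpected_variation: str        # what surprised us, and what was learned
    process_improvements: str        # how the next cycle will be run better
    relevant_context: str            # aspects of this environment that mattered
    likely_influences: list = field(default_factory=list)
    transferable: str = "unknown"    # would a similar intervention work elsewhere?

review = CycleReview(
    cycle="Cycle 2: new rooming step",
    unexpected_variation="Afternoon walk-ins overwhelmed the single test clinician.",
    process_improvements="Run the next test on a day with a second clinician available.",
    relevant_context="The clinic shares rooming staff with urgent care after 3 p.m.",
    likely_influences=["staffing overlap", "walk-in volume"],
)
print(review)

Reviewing a handful of these records side by side is often where the variation among settings, the last question above, first becomes visible.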

But remember the crucial importance of initially formulating strong theories to test. Make this a vital part of the “P.” Dialogue based on the following two questions could be a good place to start:
• Why does routine-care delivery (or product quality) fall short of standards we know we can achieve?
• How can we close this gap between what we know can be achieved and what occurs in practice?

Until next time....

About The Author

Davis Balestracci

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.

Comments

Invisible Low-Hanging Fruit

As you quoted Hacquebord: The most important problems are not the obvious ones.

Many consultants talk about the low-hanging fruit in an organization, but few can find it because it's invisible.

It's hiding in row after row of Excel spreadsheets about defects or in mainframe accounting, CRM and operating systems.

Once you get the data into Excel, you can use PivotTables to mine the data and find the invisible low-hanging fruit.

Then you can use control charts and Pareto charts to create improvement projects that will achieve breakthrough improvements.
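
For anyone who prefers to script that PivotTable step, here is a rough, hypothetical equivalent in Python with pandas; the defect categories and counts are invented purely for illustration.

# A hypothetical sketch of the PivotTable-style rollup described above,
# using pandas instead of Excel. Categories and counts are invented.
import pandas as pd

defects = pd.DataFrame({
    "category": ["billing", "labeling", "billing", "packaging",
                 "labeling", "billing", "shipping", "billing"],
    "count":    [12, 3, 9, 5, 4, 15, 2, 7],
})

pareto = (defects.groupby("category")["count"].sum()
          .sort_values(ascending=False)
          .to_frame("total"))
pareto["cum_pct"] = 100 * pareto["total"].cumsum() / pareto["total"].sum()

print(pareto)  # the few categories at the top are the invisible low-hanging fruit

From there, a Pareto chart of the totals and a control chart of the biggest category over time point to where an improvement project is worth chartering.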

Spot on, Jay...

...as always!

I appreciate your simple, but counterintuitive, approach to improvement and am in total alignment with your efforts to stamp out the overuse of tools and certifications.

Davis

PDSA and PDCA

Davis, you are right on target on the differences between PDCA and PDSA. Ron Moen and I were personally instructed on the difference by Dr. Kano when we gave our talk in Tokyo in 2009. PDCA is aimed at implementation of a standard to meet a goal. JUSE, Dr. Mizuno, and Dr. Ishikawa have always been clear on their intent.

Deming finally landed on PDSA in 1988, and it was introduced in his four-day seminars. It was published in The New Economics in 1993. Deming referred to it as the Shewhart Cycle; however, what Deming introduced did not look anything like what Shewhart had in 1939. By 1993 the PDSA cycle had the idea of deductive and inductive reasoning built into the cycle. Deming worked on developing the PDSA cycle from 1986 to 1993.

When API first published it in 1991 in the book Quality Improvement Through Planned Experimentation, Deming was quick to warn Ron with the following note: “If it speaks of the PDSA cycle, be sure to call it PDSA, not the corruption PDCA.” (Deming, 17 November 1990)

"Corruption" was fairly strong language for Deming.

Best regards,

Cliff Norman, API

Thanks, Cliff, for the clarification

As always, Cliff, a very thoughtful comment...and the reason why I recommend The Improvement Guide as the best resource for truly understanding PDSA.

Davis