Davis Balestracci

Quality Insider

PDSA… or Rock of Sisyphus?

PDSA is a messy, ugly, nonlinear process... that you absolutely need

Published: Wednesday, April 16, 2014 - 17:43

Critical thinking does not necessarily result from using a practitioner’s toolbox framed within a formal improvement structure such as Six Sigma, lean, lean Six Sigma, or the Toyota Production System. It’s very easy to get seduced by all the fancy tools, acronyms, and Japanese terminology—and promises. I’d much rather have far fewer tools used in conjunction with critical thinking to understand variation—in one’s native language. So again, it’s time to apply critical thinking to rapid-cycle PDSA, the plan-do-study-act method for learning and improvement developed by Walter Shewhart and popularized by W. Edwards Deming. If you find yourself confused and frustrated in your current applications, there’s good reason.

In the excellent article “Building Knowledge, Asking Questions” (BMJ Quality & Safety, April 2014), the authors write: “Questions regarding implementation in a specific setting, integrating evidence into practice, or improving the efficiency of local systems are often best answered using methods that differ from traditional methods of clinical research (i.e., controlled clinical trials [and traditional statistics]). While a number of formal methods exist for implementing quality improvement in practice... all advocate the use of small tests of change. Small tests of change enable one to learn how a particular intervention works in a particular setting. The goal of these methods is not to test a hypothesis but rather to gain insight into the workings of a system and improve that system.” [emphasis added]

Refereed healthcare journals still don’t “get” improvement and come down very hard on its lack of rigor. If this variation and lack of rigor are present in the efforts of practitioners who are consciously looking ahead to potential publication (and what practitioner isn’t?), that doesn’t bode well for the improvement method’s more routine use in everyday work environments. If people who are trying to impart rigor can’t do it, what would you expect of the “average” practitioner? Add the inherent human variation of everyday work, and the lack of rigor increases by at least another order of magnitude.

There is substantial variability in the way that PDSA cycles are designed, executed, and reported. In “Building Knowledge, Asking Questions,” the authors note: “Fewer than 20 percent of papers documented a sequence of iterative cycles, and only about 15 percent of articles reported the use of quantitative data at monthly or more frequent intervals to inform the progression of cycles.... Collecting data less frequently than monthly hardly seems like rapid cycle improvement.”
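That point about data frequency is operational, not academic: each cycle needs enough timely data to separate signal from noise before the next cycle begins. As a minimal, hypothetical illustration (not from the article), here is a Python sketch of a common run-chart rule of thumb: eight or more consecutive points on one side of the baseline median suggest a real shift.

```python
# Minimal run-chart "shift" check. A common rule of thumb: 8 or more
# consecutive points on one side of the baseline median signal a shift.
# All data below are invented, purely for illustration.
from statistics import median

def shift_signal(values, baseline_median, run_length=8):
    """True if `values` contains `run_length` consecutive points strictly
    on one side of `baseline_median` (points on the median break the run)."""
    run, side = 0, 0              # side: +1 above, -1 below, 0 on the median
    for v in values:
        s = (v > baseline_median) - (v < baseline_median)
        if s != 0 and s == side:
            run += 1
        else:
            side = s
            run = 1 if s != 0 else 0
        if run >= run_length:
            return True
    return False

baseline = [12, 15, 11, 14, 13, 16, 12, 15, 14, 13]  # weekly counts before the change
after    = [10, 9, 11, 8, 10, 9, 8, 7, 9, 8]         # weekly counts after the change
m = median(baseline)
print(f"baseline median = {m}; shift detected: {shift_signal(after, m)}")
```

With weekly (or more frequent) data, a rule like this tells you within a couple of months whether a change did anything; with monthly data, the same verdict takes most of a year.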

Just about all of us have seen the image in figure 1, which is used to sell the concept of rapid-cycle PDSA: a smooth progression of cycles, with each seamlessly and iteratively building on the previous. The theory is that as the number of cycles increases, their effectiveness and their overall cumulative effect strengthen.


Figure 1: “Traditional view of successive plan–do–study–act (PDSA) cycles over time depicted as a linear process. Each preceding PDSA informs the next one. As time goes on, the complexity of each intervention and trial often increases.” (Courtesy of “Building Knowledge, Asking Questions.”)
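In code form, that idealized theory would look something like the minimal Python sketch below: each cycle’s lesson feeds the next, and gains compound smoothly. The names and numbers are hypothetical placeholders for illustration, not a standard improvement API; as the rest of this article argues, reality rarely cooperates this neatly.

```python
# The "smooth staircase" of figure 1, taken at face value: every cycle's
# prediction holds, and improvements compound cycle after cycle.
# All names and numbers are hypothetical, for illustration only.

def pdsa_cycle(defect_rate, planned_reduction):
    prediction = defect_rate * (1 - planned_reduction)  # Plan: predict the effect
    observed = prediction                               # Do: run the small test
                                                        #   (idealized: no surprises)
    lesson = "prediction held; scale the change up"     # Study: compare to prediction
    return observed, lesson                             # Act: adopt, start the next cycle

rate = 0.20  # hypothetical starting defect rate
for cycle in range(1, 5):
    rate, lesson = pdsa_cycle(rate, planned_reduction=0.15)
    print(f"cycle {cycle}: defect rate {rate:.3f} ({lesson})")
```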

 

Those of us who live in the real world know better. Application involves important nuances amid the uneven, dynamic, and messy reality of implementation. Any environment has its own unique challenges and opportunities (i.e., the 20 percent of its process causing 80 percent of its problems). Change creates an interplay that is rarely neat and linear, and it is very culture-specific.
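That “20 percent causing 80 percent” is the classic Pareto principle, and it’s worth making concrete. A minimal sketch, with invented defect tallies purely for illustration: rank the causes and accumulate their share of the total until you reach the “vital few.”

```python
# Minimal Pareto analysis: rank causes by frequency and find how few
# account for ~80% of the problem. The tallies are hypothetical.
from collections import Counter

defects = Counter({
    "mislabeled order": 46, "late handoff": 31, "missing field": 9,
    "wrong unit": 6, "transcription slip": 5, "other": 3,
})
total = sum(defects.values())
cumulative = 0
for cause, count in defects.most_common():
    cumulative += count
    print(f"{cause:20s} {count:3d}  cumulative {cumulative / total:5.1%}")
    if cumulative / total >= 0.80:   # the "vital few" stop here
        break
```

In this made-up tally, two of six causes account for 77 percent of the defects; which causes those are, and why, is exactly the culture-specific part.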

Anyone who makes the process sound as simple as the PDSA image in figure 1 in order to sell you services, or who presents a demo of a pristine application at a conference, is either eager for your money or naïvely (and dangerously) unconscious of reality. In either case, don’t trust any touted results, and ask lots of questions. If you apply critical thinking to your current efforts, you will run rings around these neatly packaged and sanitized examples—guaranteed.

As many of us have discovered, application involves frequent false starts, misfires, plateaus, regroupings, backsliding, feedback, and overlapping scenarios within the process—more like the image in figure 2, which is hardly perfect circles rolling up the hill of change! Rather, it is a complex, tangled network with numerous starts, stops, and backtracks—and often incomplete cycles of change.


Figure 2: “Revised conceptual model of plan–do–study–act (PDSA) methodology.” (Courtesy of “Building Knowledge, Asking Questions”; the original article includes a deeper discussion of this image.)

 

As we all know, the naïve simplicity of upward linear progress is also a myth. Not all cycles have equal impact on project development, so they vary in size, and not all cycles are completed. Positive or negative residual effects from a cycle can linger. Cycles at varying levels of success interact with other cycles that are at different stages of plan, do, study, or act. Some cycles explore or define limitations or setbacks, and it’s not until later cycles that those challenges are harnessed to make improvement possible. And even that may not always happen.

Although the results of quality improvement efforts are important, the context within which any design and execution took place is equally important. What unexpected “variation” was encountered? In other words, what was learned about the improvement process itself and how will it be improved for future interventions? What aspects of the specific environment were relevant to the effectiveness of the intervention? What were the elements of the local-care environment considered most likely to influence change and improvement?

Determining the success of an intervention should include the aspects of the local context relevant to the theory of the intervention. As W. Edwards Deming vehemently said many times, “Examples without theory teach nothing!” The information gained from answering these questions supplies the necessary details to interpret the results and answer the crucial question: Will a similar intervention work in a different setting?

Despite the messy reality, properly executed PDSA remains fundamental as a strong, reliable methodology for improvement. Its intuitive nature provides the discipline necessary for good critical thinking and appropriate formality amid the inevitable “variation” that is present everywhere—including the variation in variation experienced among similar facilities.

I have a strong opinion that research to develop new knowledge is just about impossible in an everyday work environment—too much human variation. So, for most improvement practitioners, doesn’t it all boil down to two questions?

• Why does routine care delivery (or product quality) fall short of standards we know we can achieve?

• How can we close this gap between what we know can be achieved and what occurs in practice?

And beware of the person patronizing you by selling allegedly simple solutions. As the other quality giant of the 20th century, Joseph Juran, said, “There is no such thing as ‘improvement in general.’”


About The Author

Davis Balestracci

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.

Comments

Great analogy

Sisyphus: the perfect metaphor!

Good article

The subtitle is spot on. Figure 2 is a nice touch, too. In addition to the cluster of PDCAs in Figure 2, another diagram might convey another dimension, showing PDCA spinning away at every level of the organization (the system level, process level, activity level), pervading all work affecting quality. It would get very messy very quickly. Thanks for a good article!

Application to a Call Center

Great article, Davis, as usual. The mention of Sisyphus in this context reminded me of another article that alludes to the same myth in a similar way (http://www.isixsigma.com/operations/call-centers/futility-call-center-co....)