Finding the Unnecessary and Everyday Variation
The last two sources of problems
Davis Balestracci
Published: Wednesday, March 19, 2014

This is the last in my series making the case that the various improvement approaches are all pretty much the same.

There are seven sources of problems with a process. The first three help frame the situation:
Source 1: Inadequate knowledge of customer needs
Source 2: Inadequate knowledge of how the process currently works
Source 3: Inadequate knowledge of how the process should work

In my last column, I talked about:

Source 4: Errors and mistakes in executing procedures
Source 4a: Sentinel events, when everything that can go wrong goes wrong at once for one patient or customer—inevitably (i.e., common cause)
• How about isolating and focusing on the 20 percent of a process (vague problem) where most of the variation is occurring? This would be the time for more detailed flowcharting (see the Pareto sketch after this list).
• Avoid a microscopic, root cause analysis approach, i.e., treating all such events as special cause.
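One common way to isolate that 20 percent is a simple Pareto tally of incidents by category. Here is a minimal sketch in Python; the categories and counts are invented for illustration.

```python
# Pareto tally sketch: find the few categories carrying most of the problem.
# The categories and counts below are invented for illustration.
from collections import Counter

incidents = Counter({
    "missing paperwork": 58,
    "wrong label": 31,
    "late delivery": 12,
    "damaged item": 7,
    "other": 5,
})

total = sum(incidents.values())
cumulative = 0
for category, count in incidents.most_common():
    cumulative += count
    print(f"{category:18s} {count:4d}  {100 * cumulative / total:5.1f}% cumulative")
```

The categories accounting for the first 80 percent or so of the cumulative count are where the detailed flowcharting effort belongs.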
The fifth source considers the human and environmental design factors that keep the process from working as it “should.”

Source 5: Current practices that fail to recognize the need for preventive measures. These include:

• Physical environmental factors that make the process “perfectly designed” to have undesirable variation or incidents
• Human fatigue—insufficient attention due to sensory overload
• Poor short-term memory
• Fixation on fixing things that go wrong that could have been easily avoided
• Reversion under stress
• Over-generalization

The sixth source addresses unnecessary complexity.

Source 6: Unnecessary steps, inventory buffers, wasteful measures and data, including:
• Complexity added in the past due to inappropriate reactions to experienced variation, resulting in nonvalue-added work
• Implementing untested solutions
• Using poorly collected data to make decisions
• Routine data collections that are rarely or never used

In addition to considering these issues for the process at hand, I feel that the last element requires formal consideration as part of any project. Look at the value and quality of the process’s current default data process (e.g., collection, analysis and display, interpretation, actions taken). Is it adding value? More about this when I talk about Source 7. But before I do, allow me to digress a bit.

Examples of process-oriented thinking

There are at least three well-known examples that use process-oriented thinking.

The first, Six Sigma, emphasizes creating a process that is consistently, virtually defect-free (or free of undesirable incidents). From what I’ve observed, there is a tendency to study the observed, undesirable variation to intuit potential sources inherent in sources one through six as “theories to be tested” using planned data. The road map outlined in my last two columns helps make this process more robust.

Next, key elements of lean emphasize initial formal documentation to obtain the true current process (Source 2: reducing “human variation” in perception), appropriate error-proofing (Source 5), and exposing waste (Source 6), generally in terms of unnecessary complexity. Lean looks at an entire process with the ultimate goal of having only value-added work (i.e., work that benefits the customer). Anything else is considered nonvalue-added work, such as:
• Processing defects
• Overproduction
• Inventories
• Movement
• Excessive processing
• Transportation
• Waiting
• Underutilization

Much of this is often designed for organizational convenience or to formally rework process errors that should not even be occurring.

Finally, the Toyota Production System (TPS) takes the concept of “inventory buffers” (Source 6) one step further to an obsession with all aspects of wasted time that keep a process from “flowing.” In other words, how does one avoid the various aspects of inherent “batching” so ingrained in current work cultures... usually for the convenience of the workers? According to Taiichi Ohno, one of the TPS creators, “The Toyota mind develops brilliant processes in which average employees may excel.” Unfortunately, in healthcare it would be more accurate to say, “Healthcare systems have discontinuous processes in which brilliant staff struggle to produce average results.”

Process-oriented thinking is the anchoring concept of any sound improvement framework. It creates a common organizational language that looks at any undesirable variation objectively, which reduces blame and defensiveness.

Real statistics, not real torture

Statistics has a huge role in all of this—and it’s not the legalized torture most of you had to endure to get your colored belts or certifications. The “value-added” work of the statistical education process should be the ability to apply critical thinking to understand and reduce inappropriate and unintended variation.

I think we have a slight gap between how this process “does” work and how it “should” work. Unintended human variation in the perception of any given situation—including any data’s objective and operational (numerical) definition—will most likely render any existing or collected data worthless for improvement. Well, not exactly: If people realize that the data have no value because of this human variation, it can motivate them to improve the data process for subsequent collections.

In the context of the six sources of problems already discussed, statistics and data are key to:
Source 2: Inadequate knowledge of how a process currently works. One of the two main reasons most projects fail is lack of a good baseline estimate of the extent of a problem (the other is too much detailed flowcharting). Do in-house data exist, or can a collection be designed to plot a simple chart for assessment, which then allows one to judge the effects of interventions?
Source 3: Inadequate knowledge of how a process should work. This involves using statistical techniques to test competing theories or assess interventions to determine how the process truly “should work,” while holding any gains made going forward.
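To make the “simple chart for assessment” concrete, here is a minimal sketch of an individuals (XmR) chart baseline in Python. The weekly counts are invented, and 2.66 is the standard XmR moving-range constant.

```python
# XmR (individuals) chart sketch: establish a baseline and flag special
# causes before judging any intervention. The data below are invented.

baseline = [12, 15, 9, 14, 11, 16, 10, 13, 12, 15, 8, 14]  # weekly error counts

center = sum(baseline) / len(baseline)

# Average moving range between consecutive points
moving_ranges = [abs(b - a) for a, b in zip(baseline, baseline[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

# Natural process limits: center +/- 2.66 * average moving range
upper = center + 2.66 * avg_mr
lower = max(0.0, center - 2.66 * avg_mr)  # counts cannot be negative

print(f"Center line: {center:.1f}")
print(f"Natural process limits: {lower:.1f} to {upper:.1f}")

# Points outside the limits are signals (special causes); everything
# inside is the process's everyday (common cause) variation.
for week, value in enumerate(baseline, start=1):
    if value > upper or value < lower:
        print(f"Week {week}: {value} -> special cause, investigate this event")
```

An intervention has “worked” only if the chart shifts relative to these baseline limits, not because a single point happens to look better.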
Source 4: Errors and mistakes in executing procedures; things happening that shouldn’t happen (sentinel events). Data collected on patterns of errors, mistakes, and incidents can be studied to find hidden opportunities in a process (a small stratification sketch follows this list):
• If some people or departments are making the mistake and others aren’t, then there is knowledge in the system to prevent the mistake—or expose inconsistency in trainers’ results.
• If everyone and all departments are making the mistake, then the process is perfectly designed to have the mistake occur. It will take an overall systemic intervention to fix the problem.
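Here is a minimal sketch of that department comparison, with invented counts: compare each department’s mistake rate to the overall rate using p-chart-style limits, and treat only rates outside the limits as signals worth investigating.

```python
# Stratify mistakes by department: is a difference a signal or just noise?
# All counts below are invented for illustration.
departments = {
    "A": (8, 200),   # (mistakes, opportunities)
    "B": (25, 210),
    "C": (11, 190),
    "D": (9, 205),
}

total_mistakes = sum(m for m, n in departments.values())
total_n = sum(n for m, n in departments.values())
p_bar = total_mistakes / total_n  # overall mistake rate

for dept, (m, n) in departments.items():
    p = m / n
    # p-chart-style three-sigma limits around the overall rate
    se = (p_bar * (1 - p_bar) / n) ** 0.5
    flag = "signal" if abs(p - p_bar) > 3 * se else "common cause"
    print(f"Dept {dept}: {p:.1%} vs. overall {p_bar:.1%} -> {flag}")
```

A department flagged as a signal either holds knowledge worth copying (if low) or exposes a training or design problem (if high); the rest differ only by common cause.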
Source 7: Variation in inputs and outputs—dealing with a process’s “everyday” variation.

The four points above indeed apply to any process under study: What data are available, and how has the organization routinely dealt with them? How will the current project improve the future use of these data—including holding any gains?

Seventh source of problems with a process

But think beyond the data for any specific project. Couldn’t dealing with Source 7—let’s call it the “everyday data process”—be an ongoing project in and of itself?

Since management continues to be obsessed with cutting costs, let’s play their game. Can I challenge you to calculate a cost of poor quality for routine time-wasting meetings built around vague performance or financial data? These include:

• Daily managerial reaction based on anecdotal incidents or poorly collected data
• Scheduled quarterly or annual review meetings “accounting for” current results
• Meetings based on arbitrary numerical goals, including budgeting
• Routine scheduled meetings treating common cause as special cause

I would highly recommend adding in these additional sources suggested by Mark Graham Brown, a Baldrige Award, balanced scorecard, and data analytics expert: one hour per day for each middle manager poring over useless operational data, plus the cost of publishing these routine data reports. Sixty percent of published operational reports and 80 percent of published financial reports are waste.

Then estimate. Look at a two-week or one-month sample of leaders’ and middle managers’ schedules. For each meeting involving data, multiply the meeting’s duration by the combined salaries plus benefits of everyone in the room, sum this over all such meetings, then project it out for a year. Add in Mark Graham Brown’s sources if you wish.
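Here is a minimal sketch of that arithmetic in Python, assuming an invented two-week sample of meetings and loaded (salary plus benefits) hourly rates:

```python
# Back-of-the-envelope cost of data-driven meetings, projected to a year.
# Meetings, durations, and hourly rates are invented placeholders;
# substitute the figures from your own schedule sample.

# (meeting, total hours in the sample, loaded hourly rates of attendees)
two_week_sample = [
    ("Weekly ops review",     3.0, [95, 80, 80, 70, 70, 60]),
    ("Monthly finance recap", 2.0, [120, 95, 80, 80]),
    ("Daily huddle, 10 days", 5.0, [70, 60, 60, 60, 55]),
]

sample_cost = sum(hours * sum(rates) for _, hours, rates in two_week_sample)

WEEKS_PER_YEAR = 50  # assume two weeks of holidays
SAMPLE_WEEKS = 2
annual_cost = sample_cost * (WEEKS_PER_YEAR / SAMPLE_WEEKS)

print(f"Two-week sample cost: ${sample_cost:,.0f}")
print(f"Projected annual cost: ${annual_cost:,.0f}")
```

Even these placeholder numbers project to tens of thousands of dollars a year for three recurring meetings; a full sample across all leaders and middle managers is what produces the shock.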
I predict that the figure obtained will be shocking. Keep management in suspense by presenting it as “the biggest problem of which no one is aware.” When they lick their chops and demand to know what it is, watch their reaction as you reveal the figure.

I have yet to see someone do it. Can I challenge any of you to do so and tell me the result... even if you don’t want to risk telling your leadership?
About The Author

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach, as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.
Comments
Method for Changing Perspective is Missing
Hi Davis-
There is nothing in this that an improvement professional would argue with... for the most part. What is missing is a method for changing perspective. If you do all that is written here, you can still fail because the organization has not prepared itself for change.
For many organizations, “process thinking” has led to more waste. Even in healthcare, workers are “just following the procedure” (http://www.qualitydigest.com/inside/quality-insider-column/just-follow-p...). The variety of interactions in a service setting can lead to disaster in an inhibited “process-oriented” culture. I believe you fundamentally understand this.
The context and knowledge that workers need is an understanding of the end-to-end system. They can only get this through the insightful study of their system. By studying, they gain knowledge of the perspectives that influence their current design; these perspectives are cultural. Management and workers must be awakened to the realities of performance in the eyes of the customer and to the culture’s artifacts, perspectives, and professed values.
You have a reduced chance of successful intervention without this important step.
Tripp Babbitt