Davis Balestracci

Quality Insider

‘What’s the Trend?’

Wrong question!

Published: Monday, August 11, 2014 - 11:13

In my last column, I showed how hidden special causes can sometimes create the appearance of common cause; the purpose of common-cause strategies is to deal with this and smoke them out. When there's an underlying structure to how the data were collected, or when one can somehow code each individual data point with a trace to a process input, stratifying the data accordingly can often expose these special causes.

Even the simple coding of individual points on a graph can be every bit as effective as the more formal tool of a stratified histogram. I’m going to take the issue of understanding variation in count data further in the next couple of columns. I’ll begin here by looking at two scenarios.
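
For readers who missed that column, here is a minimal sketch of what a stratified histogram does. The data are made up (two hypothetical machines); nothing here comes from the article. Pooled, the measurements look like one wide, common-cause pile; split by a traceable process input, two distinct processes emerge.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

# Hypothetical measurements pooled from two machines. Pooled, the histogram
# looks like one wide "common cause" distribution; stratified by machine,
# two distinct processes emerge -- the hidden special cause.
machine_a = rng.normal(10.0, 0.5, 200)
machine_b = rng.normal(11.5, 0.5, 200)

plt.hist([machine_a, machine_b], bins=20, stacked=True,
         label=["Machine A", "Machine B"])
plt.xlabel("Measurement")
plt.ylabel("Count")
plt.title("Stratified histogram (hypothetical data)")
plt.legend()
plt.show()
```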

‘What’s the trend?’

I once consulted with a medical center and attended its monthly meeting on medication errors. Right before the meeting, I was handed the 24 monthly reports from the previous two years. Not sure what to do with them, I noticed something in the top right corner of each: a small table citing the number of errors for this month, last month, and the same month last year.

The meeting began, and I tuned out the people asking, “What’s the trend?” I used these 24 data tables to sketch a run chart of the past three years of performance. It looked like this:

Applying the usual run chart rules, it is common cause, but do you see a pattern to the months with high values? I stopped the meeting dead in its tracks by asking, “What happens in July?” Turns out, besides a lot of vacations, that’s when the new medical residents start. Do you think that might be the reason? In fact, because this seems to be a predictable event, why not try to prevent it?
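
The actual chart isn't reproduced here, but this kind of sketch takes only a few minutes. Here is a minimal Python example with made-up monthly counts (not the medical center's data) that draws the run chart, its median, and codes the July points:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical monthly medication-error counts for three years (36 months).
# The medical center's actual figures are not reproduced in the article.
counts = np.array([12, 10, 11,  9, 13, 10, 19, 14, 11, 10, 12,  9,
                   11, 12, 10, 13,  9, 11, 21, 13, 12, 10, 11, 12,
                   10, 11, 13,  9, 12, 10, 20, 15, 11, 12,  9, 10])
months = np.arange(1, len(counts) + 1)
median = np.median(counts)

plt.plot(months, counts, marker="o", color="gray")
plt.axhline(median, linestyle="--", label=f"median = {median:.0f}")

# Code the individual points: highlight every July (months 7, 19, 31).
july = months % 12 == 7
plt.scatter(months[july], counts[july], color="red", zorder=3, label="July")

plt.xlabel("Month")
plt.ylabel("Medication errors")
plt.legend()
plt.show()
```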

Let’s look at a similar meeting. A manufacturing plant had 45 accidents one year and set an aggressive goal for the next year of reducing them by at least 25 percent. The subsequent total was 32—a 28.9-percent decrease. However, a trend analysis (below) showed the decrease was more on the order of 46.2 percent (4.173 vs. 2.243).
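
(The article doesn't show the arithmetic, but 4.173 and 2.243 appear to be the fitted values of a least-squares trend line at the first and last months: (4.173 - 2.243) / 4.173 ≈ 46.2 percent, a figure that comes entirely from the fitted line rather than the raw annual totals.)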

Kudos to the safety committee and its hard work at the monthly safety meetings, during which each month’s accidents are individually dissected to find the “root cause”—i.e., each undesirable variation (or accident) is treated as a special cause. Also note there are three months with zero events. The reasons for these were also discussed and used to evaluate the effects of previous recommended solutions. Based on these data, approximately 80 actions (reactions to 77 accidents plus three months of zero) have been implemented during the past two years, including new policies, new reporting forms, new awareness posters, additional visual safety alerts in “dangerous” places, and plant safety meetings. All this hard work must have surely paid off.

25 percent is a ‘big’ number, but it’s just a number

Key question: Is the process that produced this year's 32 accidents truly different from the process that produced the 45 accidents from the year before? If the answer is yes, one should be able to create a run chart of individual monthly results and see at least one of three things (a rough way to check them is sketched after the list):
1. A trend of six successive decreases and/or
2. A run of eight consecutive points above the median during the first year and/or
3. A run of eight consecutive points below the median during the second year
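
These three run-chart rules are mechanical enough to check in a few lines of code. This is a rough sketch with hypothetical monthly accident counts; the article's actual monthly values appear only in the chart, not the text.

```python
import numpy as np

# Hypothetical monthly accident counts: 12 months "before," 12 months "after."
# Substitute the real values read off the run chart.
counts = np.array([5, 3, 4, 2, 6, 3, 4, 5, 3, 4, 2, 4,
                   3, 2, 4, 1, 3, 2, 4, 3, 2, 3, 2, 3])
median = np.median(counts)

def longest_decrease(x):
    """Length of the longest run of successive decreases."""
    best = run = 0
    for prev, cur in zip(x, x[1:]):
        run = run + 1 if cur < prev else 0
        best = max(best, run)
    return best

def longest_run_one_side(x, med, above):
    """Longest run of consecutive points strictly above (or below) the median."""
    best = run = 0
    for v in x:
        on_side = v > med if above else v < med
        run = run + 1 if on_side else 0
        best = max(best, run)
    return best

print("Rule 1 (six successive decreases):",
      longest_decrease(counts) >= 6)
print("Rule 2 (eight above the median, year 1):",
      longest_run_one_side(counts[:12], median, above=True) >= 8)
print("Rule 3 (eight below the median, year 2):",
      longest_run_one_side(counts[12:], median, above=False) >= 8)
```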

Take a look at the run chart below. What do you think?

What part of ‘no trend lines’ don’t people understand?

Common cause—and an expected monthly range of zero to eight! What has been the effect of the special-cause strategy of looking at each accident individually (and scraping it like a piece of burnt toast)? The strategy isn't working: the chart shows no evidence of improvement. Even though accidents “shouldn't” happen and the company met an aggressive reduction goal, this facility remains “perfectly designed” to produce them—at the same rate.
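
The article doesn't give the formula behind “zero to eight,” but one common way to get such a range for count data is a c-chart calculation on the combined two years of accidents. Here is a minimal sketch under that assumption:

```python
import math

# Total accidents over 24 months (45 + 32), per the article.
total, months = 45 + 32, 24
c_bar = total / months            # average accidents per month, about 3.21

# c-chart limits: c_bar +/- 3 * sqrt(c_bar), floored at zero for counts.
ucl = c_bar + 3 * math.sqrt(c_bar)
lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))

print(f"mean = {c_bar:.2f}, limits = {lcl:.1f} to {ucl:.1f}")
# mean = 3.21, limits = 0.0 to 8.6 -- i.e., roughly zero to eight per month
```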

Suppose that no coding of the individual observations had helped, a common occurrence with incident data or nonconformance data. What now? One does not have to accept this level of performance, but to paraphrase a favorite saying of W. Edwards Deming’s, “Statistics on the number of accidents by themselves don’t help to improve the number of accidents.” Nor is the answer a total process redesign with the vague objective of “reducing accidents.”

Do I see the Ishikawa cause-and-effect diagram from hell in this manufacturer’s future as it attempts to determine “what causes accidents”? I hope not.

The fact that the plot demonstrates common cause means that neither individual monthly points nor individual accidents can be treated as special cause—even if they can be explained after the fact. Where does one go from here? To common-cause strategies. There are additional tools, neither commonly taught nor taught with this in mind, that are most enlightening. We’ll talk about these more in my next column.


About The Author


Davis Balestracci

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.

Comments

Cause & Effect Diagram

What's wrong with using a C&E diagram to deal with common causes?

I find them very useful.

 

Rich D

"Vague"

If you want to do a HUGE Ishikawa diagram brainstorming "What causes medication errors" or "What causes accidents," be my guest.

I'd rather do a high-level stratification FIRST to find the 20% of the process causing 80% of the problem, THEN do an Ishikawa diagram.

Three sources could get exposed: (1) a certain department ALREADY DOING GOOD WORK could have a problem with one particular medication/accident type (due to a unique input), (2) certain departments might have an OVERALL problem with their "safety" or "medication prescribing" process, and (3) there might be certain accidents or error types that are being made by EVERYONE -- which would be a system problem.  These will be far more FOCUSED issues.

Plus...to discover this, you will have to collect a lot LESS data than would result from a huge Ishikawa no doubt implementing "vague" ideas suggested by "good people"...and getting "vague" results...and making a lot of people mad in the process.

If this makes you feel any better, mea culpa! I learned the above from Joiner's brilliant "Fourth Generation Management" book...and I'm making a lot fewer people mad these days by not wasting their time collecting data that doesn't ultimately help them or anyone else.

More about that in my next article...

Thanks for reading.

Ban The Phrase "Trend Line" From Business!!!

Excellent article, Davis, and good examples, as usual. The hairs on the back of my neck always stand up when someone in a meeting presents a chart of data with a "trend line". Unfortunately, the software companies have made it far too easy to add a so-called "trend line" that has no meaning to a mass of data. People love to add a trend line, no matter how slight, to support whatever premise they are trying to prove. "Oh, look! We're trending in the right direction." I usually ask for the raw data and 99% of the time find that a scatter plot with 95% confidence limits for the least-squares fit line clearly shows that the "trend" is not significant. Pass the barf bag, please. I have learned to hold my tongue in meetings (if you know me, you know that ain't easy for me!) and instead have private discussions with the perpetrator of "bad statistics", even though the damage is already done. The downside is that this usually means more work for me! OK..."Teach a man to fish and he will"...continue to ask you to analyze his/her data for him/her. Run charts are very easy to plot and analyze. We MUST find a way to use them and, in most situations, ban the phrase "trend line" from presentations. Time-ordered data are precious and should be properly analyzed.
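
As an illustration of the check described in this comment (a sketch with made-up numbers, not the commenter's data), the slope of a least-squares fit and its 95% confidence interval can be computed directly; if the interval contains zero, the "trend line" has no statistical standing:

```python
import numpy as np
from scipy import stats

# Hypothetical monthly values someone has drawn a "trend line" through.
months = np.arange(1, 25)
values = np.array([5, 3, 4, 2, 6, 3, 4, 5, 3, 4, 2, 4,
                   3, 2, 4, 1, 3, 5, 4, 3, 2, 3, 4, 3])

fit = stats.linregress(months, values)

# 95% confidence interval for the slope; if it contains zero, the "trend"
# is not statistically distinguishable from no trend at all.
t = stats.t.ppf(0.975, df=len(months) - 2)
lo, hi = fit.slope - t * fit.stderr, fit.slope + t * fit.stderr
print(f"slope = {fit.slope:.3f}, 95% CI = ({lo:.3f}, {hi:.3f}), p = {fit.pvalue:.3f}")
```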