A Common-Cause Strategy for Count Data
The accident scenario in my September column ("Sick of Boring Meetings that Waste Your Time?") showed that, despite meeting an aggressive 25-percent reduction goal (i.e., 45 accidents during the first year and 32 the following year), the process that produced the 32 was no different from the process that produced the 45. It was common cause. Now what?
One advantage of the problem's common-cause nature is that all 77 incidents were produced by the same process. Therefore, they can be aggregated, then stratified to reveal hidden special causes.
I came across the technique below in the classic Juran on Quality Improvement series from the 1980s (a seminal part of my quality improvement development). It's a Pareto matrix. In my presentations over the years, many people have told me that this has been by far their most useful diagnostic tool. Note the advantage of looking at the data in two dimensions: accident type and unit.
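The column's actual matrix isn't reproduced here, but a minimal sketch shows how such a unit-by-type Pareto matrix is tallied. The counts below are invented purely to mirror the patterns discussed in the text (Unit B concentrated in one accident type, Unit E spread across all types, accident type five occurring plantwide); none of them are the column's real data.

```python
from collections import Counter

# Hypothetical incident records as (unit, accident_type) pairs.
# All counts are illustrative, not the column's actual figures.
incidents = (
    [("B", 3)] * 13                              # Unit B: concentrated in type 3
    + [("E", t) for t in (1, 2, 3, 4, 5)] * 3    # Unit E: spread across all types
    + [(u, 5) for u in "ACD"] * 4                # type 5: everyone experiences it
    + [("A", 1), ("C", 2), ("D", 4)]             # scattered background incidents
)

units = sorted({u for u, _ in incidents})
types = sorted({t for _, t in incidents})
counts = Counter(incidents)

# Print the Pareto matrix: rows = units, columns = accident types,
# with marginal totals so both dimensions can be ranked at once.
print("Unit " + " ".join(f"T{t:>2}" for t in types) + "  Total")
for u in units:
    row = [counts[(u, t)] for t in types]
    print(f"{u:>4} " + " ".join(f"{c:>3}" for c in row) + f"  {sum(row):>5}")
col_totals = [sum(counts[(u, t)] for u in units) for t in types]
print("Tot  " + " ".join(f"{c:>3}" for c in col_totals) + f"  {sum(col_totals):>5}")
```

The row totals rank the units, the column totals rank the accident types, and the individual cells show whether a high total is concentrated in one cell or spread across many.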
Two units account for most of the accidents, and two accident types account for many of the accidents. However, look at the power of the matrix presentation when the high numbers are investigated further.
Unit B, despite its many accidents, has excellent performance except for accident type three. Also, because no one else is having trouble with this accident type, the odds of rectifying the situation are quite good. It might not even reflect departmental competence but merely a different input to the work process (e.g., people, methods, machines, materials, measurement, environment) that's unique to that department and makes its work environment inherently more dangerous. Now, what would a plantwide safety seminar on accident type three accomplish? It would treat a special cause as if it were a common cause--and waste a lot of people's time in the process.
Unit E, on the other hand, presents no such clear, localized fix. For whatever reason, its entire safety performance is suspect because it's experiencing all the accident types. This will take further investigation of Unit E's accidents.
After studying accident type five, it becomes obvious that it's a problem for the entire plant because everyone is experiencing it. It's not as simple as saying, "Be more careful." The plant is perfectly designed to have this hazardous situation.
If appropriate action could be taken on these three significant sources of undesirable variation, accidents could potentially be reduced by 50 percent. But by concentrating only on the monthly total and overlooking this common-cause strategy--stratifying accidents by process "inputs," i.e., people (unit) and measurement (accident type)--Unit B would continue to have accident type three, Unit E would continue its poor safety performance and accident type five would continue unabated.
Last month's column also contained a medication error analysis that showed a special cause in the time plot every July. For that case, couldn't one do a Pareto matrix of just the July errors and a separate Pareto matrix of the aggregated data from the other 11 months, to see what must be done to prevent the special cause in July?
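That July-versus-everything-else split can be sketched in a few lines. The records, ward names and counts below are all hypothetical (none come from the column); the point is simply tallying two separate matrices from the same stream of stratified data:

```python
from collections import Counter

# Hypothetical medication-error records: (month, unit, error_type).
# All names and counts are invented for illustration.
errors = (
    [(7, "Ward1", "dose")] * 6                             # July spike
    + [(7, "Ward2", "route")] * 2
    + [(m, "Ward2", "route") for m in range(1, 13) if m != 7]
    + [(m, "Ward3", "dose") for m in (2, 5, 9, 11)]
)

# Two separate Pareto matrices: one for July, one for the other 11 months.
july = Counter((unit, etype) for month, unit, etype in errors if month == 7)
rest = Counter((unit, etype) for month, unit, etype in errors if month != 7)

print("July errors by (unit, type):", dict(july))
print("Other 11 months by (unit, type):", dict(rest))
```

If one cell dominates the July matrix but barely appears in the other months' matrix, that cell is where to start looking for the recurring July special cause.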
In July's column ("It's Time to Ignore the Traffic Lights"), a system's percent conformance to its goal was consistent (i.e., common cause), which means that its percent nonconformance was also consistent. It also means that a nonconformance is a nonconformance, regardless of whether managers consider it in "red," "yellow" or "green" mode. (There was a tendency to scrutinize every individual breach on "red" days.) Couldn't one take the last 200 nonconformances from a stable period and stratify them via the matrix to look for hidden special causes?
It's much more productive to have people brainstorm ways to stratify data in two dimensions than to waste energy explaining why this month's result is different from last month's. It also helps to focus subsequent, more detailed diagnosis on the 20 percent of the process causing 80 percent of the problem. This preliminary work would result in a much less overwhelming cause-and-effect diagram and a more focused subsequent effort than what could be produced in response to the vague question, "What causes accidents?"
Before any action can be taken, the process producing the data must be assessed. As Deming used to say, "Figures on accidents do nothing to reduce the frequency of accidents."
Davis Balestracci is a member of the American Society for Quality and the Association for Quality and Participation. He previously served as chair of the statistics division of ASQ. His book, Quality Improvement: Practical Applications for Medical Group Practice (Center for Research in Ambulatory Health Care Administration, 1994), is in its second edition. Visit his Web site at www.dbharmony.com.