
William A. Levinson

Health Care

Are Control Charts Suitable for Health Care Applications?

Adverse events require closed-loop corrective action to prevent recurrence

Published: Monday, December 12, 2011 - 10:54

Hospital-acquired infections, ventilator-acquired pneumonia, patient falls, and similar events are (hopefully) rare enough to promote discussion of control charts for rare events. A Google search will, for example, turn up the application of u charts to falls per 1,000 patient days (u being defect density, such as defects per unit or defects per time period).

If the undesirable occurrence is sufficiently rare, though, the normal approximation to the Poisson distribution no longer applies, so the traditional u chart will not work. I advocate use of the exact discrete distribution to eliminate reliance on the normal approximation, but even this is not adequate for extremely rare events. Charts based on the geometric or exponential distribution, whose metrics are, respectively, the number of units or the time between undesirable events, are more applicable in those cases. In “Working with Rare Events,” Donald J. Wheeler compares the XmR chart, the g chart (counts between events), and the t chart (time between events, not to be confused with the multivariate T² chart or a chart for the t statistic).
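As a sketch of why the traditional chart breaks down, the normal-approximation u chart limits and a geometric-based g chart alternative can be computed in a few lines of Python. The rates and subgroup sizes below are illustrative only, not figures from this article:

```python
import math

def u_chart_limits(u_bar, n):
    """3-sigma u chart limits from the normal approximation to the Poisson.

    u_bar: average defect density (e.g., falls per 1,000 patient days)
    n: subgroup size in the same units (here, thousands of patient days)
    """
    sigma = math.sqrt(u_bar / n)
    return u_bar - 3 * sigma, u_bar + 3 * sigma

# Illustrative rate: 2.5 falls per 1,000 patient days, with one
# subgroup per 1,000 patient days.  The lower control limit goes
# negative, so the chart can never signal an improvement in the rate.
lcl, ucl = u_chart_limits(2.5, 1.0)

def g_chart_limits(g_bar):
    """Approximate 3-sigma limits for a g chart (units between events).

    A geometric distribution with mean g_bar has variance
    g_bar * (g_bar + 1); the lower limit is truncated at zero.
    """
    sigma = math.sqrt(g_bar * (g_bar + 1))
    return max(0.0, g_bar - 3 * sigma), g_bar + 3 * sigma
```

Plotting the count of units between events against `g_chart_limits` avoids the normal approximation's failure at very low rates, at the cost of a highly skewed in-control distribution.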

The statistical technology therefore exists to monitor rare events, but is this technology the right solution to the wrong problem? The entire purpose of a control chart is to distinguish between random or common cause variation and special or assignable cause variation. If, for example, a dimension’s nominal is 1.0000 in., the tool may deliver 0.9998 in. on one part and 1.0001 in. on the next due to unavoidable but acceptable (depending on the width of the specification) random variation.
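The common-cause/special-cause distinction for the machining example can be illustrated with a minimal individuals (XmR) chart sketch. The measurements below are made up around the 1.0000 in. nominal, and 2.66 is the standard factor for converting the average moving range into natural process limits:

```python
def xmr_limits(xs):
    """Natural process limits x_bar +/- 2.66 * mR_bar for an XmR chart."""
    x_bar = sum(xs) / len(xs)
    moving_ranges = [abs(b - a) for a, b in zip(xs, xs[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return x_bar - 2.66 * mr_bar, x_bar + 2.66 * mr_bar

# Made-up part dimensions (inches) around the 1.0000 in. nominal
parts = [0.9998, 1.0001, 1.0000, 0.9999, 1.0002, 0.9997, 1.0001]
lnpl, unpl = xmr_limits(parts)
# Points inside (lnpl, unpl) reflect common cause variation; a point
# outside would signal an assignable cause worth investigating.
```

All seven made-up measurements fall inside the natural process limits, which is exactly the "unavoidable but acceptable random variation" the paragraph describes.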

However, a strong argument can be made that almost every nosocomial (originating in a hospital) complication is due to a special or assignable cause and therefore requires investigation and closed-loop corrective action. People do not fall, get infections, and so on because of unavoidable random variation. Even if the occurrence rate is very low indeed, each incident is 100-percent poor quality to the patient involved. This poor quality carries a 9 or 10 severity rating for failure mode and effects analysis (FMEA) purposes because these ratings are reserved for events that jeopardize human life or safety.

There was a time when the medical profession attributed hospital-acquired infections to “miasms” in the air, which was little better than attributing them to evil spirits. Infections were therefore purportedly random arrivals about which nothing could be done. Joseph Lister then proved that antiseptic conditions eliminated most of these infections by killing the bacteria that cause them. Even though this knowledge has been available for more than 130 years, hand-washing compliance is still only about 50 percent in most hospitals. HyGreen has developed an automatic reminder system to ensure that hospital workers wash their hands between patients. This application of Henry Ford’s “can’t rather than don’t” safety principle—i.e., the worker cannot perform an unsafe action, as opposed to having to remember not to do it—reduced nosocomial infections at Miami Children’s Hospital. The purported common cause variation therefore turned out to be a very assignable cause whose removal eliminated most of the trouble.

Patient falls also are not random events that are suitable for tracking on control charts. Investigation will show that most have a special or assignable cause:

Patient falls are among the top concerns of hospitals today. Stryker’s patented Chaperone bed exit system accurately senses body positioning by constantly tracking the patient’s center of gravity. By basing its alert system on the patient’s weight distribution, Chaperone is an effective, reliable tool to help prevent falls and reduce false alarms.

This is another application of the Ford can’t-rather-than-don’t safety principle. In addition, the Stryker GoBed II can be lowered to 14.5 in. The lockout-tagout concept of zero mechanical potential, or at least minimum mechanical potential, comes to mind immediately. If the patient somehow manages to fall out despite the safety features, he is far less likely to suffer serious or even minor injury.

The Institute for Healthcare Improvement (IHI) meanwhile offers “bundles,” or sets of best practices, that enormously reduce central line infections and ventilator-acquired pneumonia. IHI notes that ventilator-acquired pneumonia adds $40,000 to the cost of a typical hospital stay and raises mortality from 32 percent to 46 percent for ventilator patients.

All this suggests that control charts are generally not suitable controls for medical complications, most of which have special or assignable causes that are removable by closed-loop corrective action. They are obviously not suitable for medication errors, which are assignable cause by definition.

With the exception of willful failure to follow established procedures—and by this we mean deliberate circumvention of error-proofing systems like those described above, as opposed to, for example, forgetting to wash one’s hands in the absence of such a system—“assignable” means a problem with the system in which the people must work, as opposed to an individual who makes a mistake. The quality sciences recognize that, if it is possible to forget to wash one’s hands, even the most diligent people will eventually do so in the course of hundreds or thousands of patient interactions every year. If people are afraid they might be disciplined for reporting that they almost made an error—the Japanese call this a hiyari or scare report—or even that they did make an error, the deficiency that allowed the error to happen will never be corrected. A near-mistake or mistake that is reported will happen only once if there is prompt closed-loop corrective action, while one that people cover up will recur indefinitely.

Instructions to “be more careful” (i.e., “don’t rather than can’t”) do not qualify as closed-loop corrective action. If it is possible to pick up a vial of high-concentration heparin instead of low-concentration heparin, that too will happen eventually and create a 9 or 10 severity incident. This kind of accident can, incidentally, be made almost impossible: Baxter, a manufacturer of heparin, added a special label that must be torn off before the high-concentration vial can be used.

Pharmacists of the 19th century used bottles of unusual shape, or with rough surfaces, for particularly dangerous medications. A publication in 1899 cited a proposed New York “poison bottle bill” that required the sale of all such liquids in octagonal bottles. England’s House of Lords considered similar legislation as long ago as 1863; it would have required poisons to be sold only in hexagonal bottles while prohibiting the sale of any nonpoisonous liquid in such a container. Lord Salisbury argued against this legislation on the grounds that it should not be necessary to protect sensible Britons from the consequences of not striking a light to read a bottle’s label before ingesting its contents at night, although less enlightened foreign countries felt it necessary to protect people from the consequences of their own acts. The truth is, of course, that if millions of people take medications, somebody is sooner or later going to pick up the wrong bottle unless the bottle makes it impossible to do so.

The prevention of medication errors offers enormous savings of both human life and money; the average cost of an adverse drug event is $8,750. (This average is almost certainly from a highly skewed distribution that includes a majority of much lower costs along with malpractice awards that exceed half a million dollars in egregious cases.) The same reference adds that 4 out of every 10 medical errors involve medication mismanagement. These, too, are assignable or special cause problems that are largely preventable with a can’t-rather-than-don’t approach.

In summary, the application of control charts to medical “defects” such as adverse drug events and hospital-acquired infections risks treating assignable or special cause incidents as routine random or common cause events. The severity of these events, and in most cases their preventability, demands that each be treated as a special cause incident requiring closed-loop corrective action to prevent recurrence.


About The Author

William A. Levinson

William A. Levinson, P.E., FASQ, CQE, CMQ/OE, is the principal of Levinson Productivity Systems P.C. and the author of The Expanded and Annotated My Life and Work: Henry Ford’s Universal Code for World-Class Success (Productivity Press, 2013).