The QA Pharm


Good Metrics Practice for Quality Management Reviews

Data talk—but are you listening?

Published: Monday, June 13, 2016 - 11:45

A quality management review of data with responsible company leadership is a current good manufacturing practices requirement. Quality management review procedures vary, but organizations often struggle to present data from across the quality management system in a meaningful and consistent manner when there are multiple contributors. Following are a few ways to organize your information in a more coherent fashion.

Report an opportunity for improvement. Reporting the opportunity helps to keep the focus on where to improve. For instance, report that 10 percent of investigations were overdue, rather than that 90 percent were completed on time.

A decrease shows improvement. A downward trend means improvement toward zero problems. For example, following root cause analysis training, recurring deviations decreased from 20 percent to 5 percent in six months.

Compare against historical performance. Comparing current performance with a previous period helps to illustrate improvement. For example, you may point out that 95 percent of planned supplier audits were conducted in the current quarter, compared to 5 percent in the previous quarter.

Index metrics for relative comparisons. Indexing normalizes counts to a common denominator, removing the effect of differing volumes between periods and making comparisons meaningful. For example, there have been seven complaints per billion units manufactured year-to-date vs. 18 complaints for the same period the previous year.
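As a sketch of the arithmetic, an indexed metric is simply a count normalized to a common denominator. The function name and the production volumes below are illustrative assumptions, not figures from the article:

```python
def complaints_per_billion(complaints: int, units_manufactured: int) -> float:
    """Normalize a raw complaint count to complaints per billion units,
    so periods with different production volumes can be compared."""
    return complaints / units_manufactured * 1_000_000_000

# Year-to-date: 7 complaints across 1 billion units (assumed volume)
ytd_rate = complaints_per_billion(7, 1_000_000_000)    # 7.0 per billion

# Same period last year: 18 complaints across 2 billion units (assumed volume)
prior_rate = complaints_per_billion(18, 2_000_000_000)  # 9.0 per billion
```

Note how the raw counts (7 vs. 18) exaggerate the improvement when the denominators differ; the indexed rates (7.0 vs. 9.0) tell the honest story.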

Report absolute numbers for critical issues. Indexing should be avoided when the issue is critical or the numbers are low. For example, report that two batches were recalled, rather than that 0.2 percent of batches were recalled.

Note events with markers on the timeline. When data are reported vs. time, it’s helpful to note significant events that had an effect on these data. For example, indicate that the trend line for environmental monitoring excursions started to increase when building construction started.

Define an unacceptable trend. Trends should be defined for run chart performance data. For example, consider the statistical process control rules of five consecutive movements in the same direction, or seven consecutive points on the same side of the average.
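The two run-chart rules quoted above can be sketched in a few lines; the function names are my own, and the thresholds simply follow the rules as stated:

```python
def has_trend(values, n=5):
    """True if n consecutive movements go in the same direction
    (the 'five consecutive movements' rule)."""
    run, last_sign = 0, 0
    for prev, curr in zip(values, values[1:]):
        sign = (curr > prev) - (curr < prev)       # +1 up, -1 down, 0 flat
        run = run + 1 if sign != 0 and sign == last_sign else (1 if sign else 0)
        last_sign = sign
        if run >= n:
            return True
    return False

def has_shift(values, n=7):
    """True if n consecutive points fall on the same side of the average
    (the 'seven consecutive points' rule)."""
    avg = sum(values) / len(values)
    run, last_side = 0, 0
    for v in values:
        side = (v > avg) - (v < avg)               # +1 above, -1 below, 0 on it
        run = run + 1 if side != 0 and side == last_side else (1 if side else 0)
        last_side = side
        if run >= n:
            return True
    return False
```

A steadily rising series such as `[1, 2, 3, 4, 5, 6]` trips `has_trend`, while an alternating series does not; a cluster of points stuck below the average trips `has_shift`.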

Report a measure of variability with averages. When reporting averages, be certain that the underlying data can legitimately be combined, and provide a measure of variability. For example, reporting an improvement because the average number of deviations per batch fell from 25 (previous 10 batches) to 17 (last 10 batches) is misleading when the range of deviations widened from 23 to 28 to a range of 5 to 45.
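A quick sketch makes the point concrete. The per-batch deviation counts below are hypothetical numbers constructed to match the averages and ranges in the example, not actual data:

```python
from statistics import mean

# Previous 10 batches: average 25 deviations, range 23-28 (stable process)
previous_10 = [23, 24, 25, 25, 25, 25, 26, 26, 23, 28]

# Last 10 batches: average 17 deviations, range 5-45 (erratic process)
current_10 = [5, 45, 10, 12, 15, 20, 8, 25, 14, 16]

def summarize(batches):
    """Return (average, minimum, maximum) for a set of batch counts."""
    return mean(batches), min(batches), max(batches)

print(summarize(previous_10))  # (25, 23, 28)
print(summarize(current_10))   # (17, 5, 45)
```

The average alone suggests improvement; pairing it with the range shows the process has actually become far less predictable.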

Chart scales must be sensitive to the intended purpose. The scale of a chart should be tight enough to make the range of normal variation visible, yet wide enough to include all excursions within the time frame depicted. For example, a chart scale of 0 percent to 100 percent for percent overdue nonconformance investigations is inappropriate for a 12-month performance chart with normal variation of 3 percent to 6 percent. A more appropriate scale would be 0 percent to 12 percent. If the same time frame included an excursion of 18 percent, then a chart scale of 0 percent to 20 percent would be appropriate.
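One way to sketch this scaling logic in code. The doubling rule and the round-up-to-a-multiple-of-five convention are assumptions chosen to reproduce the example figures above, not a prescription from the article:

```python
import math

def chart_upper_limit(normal_max, peak):
    """Pick a y-axis upper limit: roughly double the normal maximum so
    routine variation fills the chart, then widen to the next multiple
    of 5 (an assumed convention) if an excursion would fall off-scale."""
    upper = 2 * normal_max
    if peak > upper:
        upper = 5 * math.ceil(peak / 5)
    return upper

chart_upper_limit(6, 6)    # normal variation up to 6% -> scale 0-12%
chart_upper_limit(6, 18)   # 18% excursion            -> scale 0-20%
```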

And always remember: data talk, opinions walk.


About The Author

The QA Pharm

The QA Pharm is a service of John Snyder & Co. Inc., provider of consulting services to FDA-regulated companies to build quality management systems and develop corrective actions that address regulatory compliance observations, as well as communication strategies to protect against enforcement action. John E. Snyder worked at the lab bench, on the management board, and as an observer of the pharmaceutical industry for more than 30 years. His posts on The QA Pharm blog are straight talk about the challenges faced by company management and internal quality professionals. Snyder is the author of Murder for Diversion (Jacob Blake Pharma Mystery Series Book 1).