
Christopher Shoe

Management

Four Enterprise-Level Numbers to Track

Leveraging LPA metrics for meaningful improvement

Published: Monday, October 21, 2019 - 11:03

According to a recent LNS Research survey, 37 percent of quality leaders cite an inability to measure quality metrics as their No. 1 barrier to achieving quality goals. Even worse, the survey showed four in five companies have poor visibility into real-time metrics.

These figures highlight a central problem in quality management: In an era of increasingly large data sets, how can manufacturers leverage these data for meaningful improvement?

It’s a question that’s especially relevant for manufacturers engaged in layered process audits (LPAs), a high-frequency verification strategy where teams conduct short audits every shift. With hundreds or even thousands of audits taking place during the course of a year, making sense of a large volume of data is a core challenge of LPA programs.

While plant managers often track metrics at a granular level, executives need to look at the data a little differently. Let’s look at four of the most important enterprise-level LPA metrics to track.

1. Audit completion and pass rates

Like site-level leaders, management should keep a close eye on audit completion and pass rates. On an enterprise level, you’ll want to analyze these metrics by location to see where audits are happening, which plants are most successful, and which ones need extra help.

This information can give you better visibility into problems and the strategic actions needed to improve LPAs, such as identifying plants that are pencil-whipping or rushing through audits. For example, if a plant with many defects and complaints also has a high audit pass rate, it’s a sign that leaders need to investigate.

On the other hand, you might have one or more plants performing well on audit completion and pass rates while also achieving minimal defects. This case represents a potential opportunity to extract best practices from those high performers and apply them across the organization.

But won’t people avoid noting nonconformances to make their audit pass rates look better than they actually are? It’s a definite possibility. To counter this tendency, consider tracking this metric on a high level without publishing it down the pipeline or making it part of a plant’s official targets.
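To make this concrete, here’s a minimal Python sketch of computing completion and pass rates by location. The record fields (`plant`, `completed`, `passed`) and the sample data are hypothetical, not taken from any particular LPA system.

```python
from collections import defaultdict

# Hypothetical audit records; field names are illustrative assumptions.
audits = [
    {"plant": "Plant A", "completed": True,  "passed": True},
    {"plant": "Plant A", "completed": True,  "passed": False},
    {"plant": "Plant A", "completed": False, "passed": False},
    {"plant": "Plant B", "completed": True,  "passed": True},
    {"plant": "Plant B", "completed": True,  "passed": True},
]

def rates_by_plant(records):
    """Return {plant: (completion_rate, pass_rate)} per location."""
    totals = defaultdict(lambda: [0, 0, 0])  # scheduled, completed, passed
    for r in records:
        t = totals[r["plant"]]
        t[0] += 1                  # every record counts as a scheduled audit
        if r["completed"]:
            t[1] += 1
            if r["passed"]:
                t[2] += 1
    return {
        plant: (done / sched, (passed / done) if done else 0.0)
        for plant, (sched, done, passed) in totals.items()
    }
```

Sorting the resulting rates lowest-first would surface the plants that need extra help.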

2. Nonconformances by category

Slicing and dicing nonconformance data in different ways is a quick way to identify your biggest trouble areas on an organizational level. Many companies will track audit nonconformances according to categories such as:
• Process area
• Types of LPA questions failed
• Nonconformance type, e.g., tool calibration, process, documentation, or scrap
• Most common corrective actions
• Quality vs. health and safety findings

Viewing these data in Pareto chart format can help you pinpoint where your efforts will have the biggest impact in the organization. Let’s say you have an ambitious goal of reducing failures by 25 percent. Using the secondary y-axis showing cumulative percentage, you can easily pick out the few categories to focus on that will get you to your goal.
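The cumulative-percent logic behind that Pareto selection can be sketched in a few lines of Python. The category names and counts below are invented for illustration.

```python
def pareto_focus(counts, target_fraction):
    """Pick the fewest categories whose cumulative share of all
    nonconformances meets the reduction target."""
    total = sum(counts.values())
    chosen, cum = [], 0
    # Walk categories from largest to smallest, as on a Pareto chart.
    for cat, n in sorted(counts.items(), key=lambda kv: -kv[1]):
        chosen.append(cat)
        cum += n
        if cum / total >= target_fraction:
            break
    return chosen

# Hypothetical nonconformance counts by category.
findings = {
    "documentation": 40,
    "tool calibration": 25,
    "process": 20,
    "scrap": 15,
}
```

With these numbers, a 25-percent reduction goal could in principle be met by focusing on documentation findings alone.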

3. Nonconformance rate

Nonconformance rate is another popular LPA metric that manufacturers track at the enterprise level: Of the audits completed, how many actually lead to nonconformance findings or corrective actions, and why?

This metric is important because it gives leaders a sense of how effectively plants are using LPA data. An abnormally high nonconformance rate at one plant points to a more systemic issue on the plant floor that requires addressing at a deeper level.
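One simple way to operationalize “abnormally high” is a z-score rule against the fleet average. The plant names, rates, and two-sigma cutoff below are illustrative assumptions, not a prescribed threshold.

```python
from statistics import mean, stdev

# Hypothetical nonconformance rates (findings per completed audit) by plant.
nc_rates = {"A": 0.10, "B": 0.12, "C": 0.11, "D": 0.09, "E": 0.10, "F": 0.45}

def flag_outlier_plants(rates, z=2.0):
    """Flag plants whose nonconformance rate sits well above the
    fleet average (simple z-score rule; the 2-sigma cutoff is an
    arbitrary illustrative choice)."""
    mu, sigma = mean(rates.values()), stdev(rates.values())
    return [p for p, r in rates.items() if sigma and (r - mu) / sigma > z]
```

Flagged plants would then get a deeper root-cause review rather than a blanket target.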

4. Predictive analytics

Many organizations are interested in using predictive analytics to improve quality performance. One way they can do that is to look at how metrics like audit completion rate, audit pass rate, and nonconformance rate change over time. By applying time-series analysis to these data, companies can identify trends and potentially even predict what those rates will look like three months into the future.

This analysis is particularly powerful when applied to leading metrics that correlate with other key quality outcomes. For instance, if an uptick in audit nonconformance findings usually precedes customer complaints, you can step in before that happens if you notice nonconformances increasing in any given plant.
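As a toy stand-in for a real time-series method, a least-squares trend line extrapolated a few periods forward illustrates the idea. The monthly rates below are fabricated to be perfectly linear, which real data never are.

```python
def linear_forecast(series, steps_ahead):
    """Fit an ordinary least-squares trend line and extrapolate it
    steps_ahead periods past the end of the series (a simplistic
    sketch, not a production forecasting method)."""
    n = len(series)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(series) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, series))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return intercept + slope * (n - 1 + steps_ahead)

# Hypothetical monthly nonconformance rates trending upward.
monthly_nc_rate = [0.10, 0.12, 0.14, 0.16]
```

A rising forecast like this, caught early, is the trigger to intervene before customer complaints follow.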

Taking it a step further, manufacturers could then use this information to develop poka-yoke tools targeted to specific high-risk processes or steps. At the end of the day, this kind of concrete action is what tracking LPA metrics is all about: using completion rates, pass rates, and nonconformance data to drive targeted quality improvements.

First published Aug. 28, 2019, on the Beacon Quality blog.


About The Author

Christopher Shoe

Christopher Shoe is the director of data science at Ease Inc., where he focuses on optimizing data for Beacon Quality customers and maintaining the Beacon Quality data warehouse. Shoe holds a master’s degree in business administration and resides in Southern California with his wife and two miniature schnauzers.