
Douglas C. Fair


Making the Most of Quality Data

Problems transformed into competitive business advantage

Published: Tuesday, June 27, 2017 - 11:03

Plant-floor quality issues tend to consume a company’s technical resources. When products fall out of spec, alarms sound and all hands are immediately on deck to fix things. Despite large technology investments to monitor and adjust production processes, manufacturers are still bedeviled by quality problems. The issue is not a lack of technology; it is a lack of quality intelligence.

When problems occur, manufacturers must obviously fix them. But the typical organization expends much more energy reacting to problems than preventing them. This is true despite our understanding that “an ounce of prevention is worth a pound of cure.” We know that proactive measures can be immensely profitable, and yet our limited quality resources spend little time identifying strategic imperatives for avoiding problems. Instead, most of their time is spent responding to issues. Today’s quality professionals are too preoccupied with just fighting the fires that rage on shop floors.

Quality and the big picture

The most successful, forward-looking, and competitive companies I work with focus on proactively preventing problems. How? By taking a holistic view of quality. They regularly step back to summarize and analyze large amounts of quality data. Stepping back gets them away from the fires and out of the routine of fixing issues.

Imagine aggregating all of your quality data for the last month across all products and production lines. Doing so would allow you to see the nuanced quality differences between regions and plants. It would tell you where systemic issues need to be addressed and help prioritize improvement efforts. In other words, aggregating data allows you to see the big quality picture.

Today’s manufacturing plants make a dizzying variety of products. So you may be wondering how wholesale information can be extracted from vastly different parts, material types, and specification limits. The answer is data normalization, which allows fair comparisons even between disparate items. Today this is just a mathematical exercise, easy to perform with software, making data unification and summarization a reality.
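As a sketch of what such normalization can look like (the formula, part names, and values below are illustrative assumptions, not a specific vendor’s method), one simple approach expresses each measurement as a fraction of its own tolerance window, so parts with completely different units and spec limits land on the same 0-to-1 scale:

```python
def normalize_to_tolerance(value, lsl, usl):
    """Map a measurement onto a unitless 0-1 scale, where 0 is the
    lower spec limit (lsl), 1 is the upper spec limit (usl), and
    0.5 is the center of the tolerance window."""
    return (value - lsl) / (usl - lsl)

# Two very different characteristics become directly comparable
# (hypothetical parts and limits for illustration):
shaft = normalize_to_tolerance(25.02, lsl=24.95, usl=25.05)   # diameter, mm
torque = normalize_to_tolerance(118.0, lsl=100.0, usl=140.0)  # fastener, N*m
```

On this common scale, the shaft (0.7) is running noticeably closer to its upper limit than the torque reading (0.45) is to either of its limits, even though the raw numbers share nothing in common.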

Stop “storing and ignoring”

When critical features fall out of spec, alarms blare, support personnel descend on the shop floor, and the issues get fixed. After completing their tasks, they quickly move on to the next daily priority or fire drill. In this case, at least the alarm data were used to solve the problem.

But what happens to data that triggers no alarms? What about data that meets specification limits? Most will say that if data is in spec, then it is good enough. And that is the problem. When data is considered “good enough,” it is simply stored in a database, rarely to be seen again. The error here is assuming that because the data didn’t trigger an alarm, it contains no useful information. If data is never reviewed or analyzed, expect to be blind to the information it contains. The truth is that value exists in any data you collect; otherwise, it shouldn’t be collected.

When companies ignore in-spec data, they are throwing away enormously useful information. Many companies I have worked with have turned orphaned data into gold by extracting previously unknown information from it. It was unknown simply because they considered the data to be “good enough.” As a result, they were blind to the information the data contained. These experiences have led me to conclude that the greatest potential for modern quality improvement comes from aggregating and analyzing data that actually falls within specifications.
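One concrete way to mine in-spec data is a process capability index such as Cpk: every individual reading can sit comfortably inside the limits while the index warns that the process is hugging one of them. A minimal sketch, with made-up readings and spec limits chosen purely for illustration:

```python
import statistics

def cpk(values, lsl, usl):
    """Process capability index: distance from the process mean to the
    nearest spec limit, in units of three standard deviations."""
    mean = statistics.mean(values)
    sigma = statistics.stdev(values)  # sample standard deviation
    return min(usl - mean, mean - lsl) / (3 * sigma)

# Every reading is inside the 7.0-10.0 spec window, so no alarm ever fired...
readings = [9.7, 9.8, 9.9, 9.8, 9.7, 9.9, 9.8, 9.8]

# ...yet the process is crowding the upper limit: Cpk comes out near 0.88,
# well below the common 1.33 rule of thumb for a capable process.
risk = cpk(readings, lsl=7.0, usl=10.0)
```

The alarm-driven view sees eight good parts; the aggregated view sees a process one small drift away from scrap.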

Seem odd? Not to me. Think about how often parts actually fail to meet specs. It’s rare. That means very few data values are ever viewed for problem-solving purposes. And if those few values receive the lion’s share of attention, what happens to the huge amount of data where no problems exist? It is stored and ignored.

And it’s getting worse. Because modern technologies support automated data collection, far more data is currently being gathered than in years past. This means that the amount of data being ignored is increasing. It’s staggering how much data is available and yet how little of it is ever viewed.

The reality is that companies rarely go back and look at data that is in spec. Yet, there is rich, valuable information hidden in those overlooked records. Imagine being an operations director who oversees 50 plants. If you could roll up all of your critical quality data across those locations, you would immediately have a holistic view of your manufacturing operations. You could identify which regions are the best performers. You could highlight the plants and production lines with the highest quality costs. You could pinpoint where defect levels could be reduced and which plants require attention to minimize the probability of recalls. And your company could become more competitive as a result.
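The roll-up described above can be sketched in a few lines. Everything here is a toy assumption (plant names, records, and the first-pass-yield metric are invented for illustration), but the shape of the computation is the point: aggregate per-plant, then rank:

```python
from collections import defaultdict

# Hypothetical inspection records: (plant, measurement, lsl, usl)
records = [
    ("Springfield", 9.8, 7.0, 10.0),
    ("Springfield", 10.2, 7.0, 10.0),   # out of spec
    ("Springfield", 9.1, 7.0, 10.0),
    ("Toledo", 8.4, 7.0, 10.0),
    ("Toledo", 9.0, 7.0, 10.0),
    ("Toledo", 8.7, 7.0, 10.0),
]

def first_pass_yield(records):
    """Roll inspection results up to a per-plant in-spec fraction."""
    counts = defaultdict(lambda: [0, 0])  # plant -> [in_spec, total]
    for plant, value, lsl, usl in records:
        counts[plant][1] += 1
        if lsl <= value <= usl:
            counts[plant][0] += 1
    return {plant: ok / total for plant, (ok, total) in counts.items()}

# Rank plants from best to worst performer
ranking = sorted(first_pass_yield(records).items(),
                 key=lambda kv: kv[1], reverse=True)
```

With real data the records would number in the millions and span 50 plants, but the output is the same: an at-a-glance league table showing where defect levels, and recall risk, are concentrated.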

Rather than simply reacting to quality problems, manufacturers need to direct their attention and time to proactively improving quality. How? By regularly evaluating the massive amount of overlooked data that they already have.

Data aggregation through cloud technology

Traditional on-premises software solutions aren’t great for deploying across an enterprise. But cloud-based quality software platforms are. Since cloud-based solutions are securely hosted by vendors who monitor and maintain system infrastructure, the need for on-site IT support is minimized and capital costs are greatly reduced. The nature of cloud-based systems makes large-scale, multi-plant deployments fast, easy, and inexpensive, ensuring benefits are enjoyed sooner rather than later.

Plus, cloud-based systems connect manufacturing sites across the internet, support standardization, and store quality data from multiple plants in a centralized database. Because data is stored in one place, quality professionals, engineers, managers, and others can easily view the big picture of quality. A single data repository is ideal for supporting corporatewide quality strategies and initiatives.

Cloud-based quality systems should be accessible through a simple web browser, empowering quality professionals to break through geographical, cultural, and infrastructural barriers to connect facilities around the world. They should also provide data aggregation capabilities that can unlock critical information for driving quality improvements on a large scale.

The capability is here and the technology is inexpensive. So what keeps quality professionals from enjoying enterprisewide cost and defect reduction? It’s those fires you keep fighting every day. Don’t just snuff them out—prevent them in the first place and use the time savings to re-imagine how quality can transform your organization’s performance.



About The Author

Douglas C. Fair

A quality professional with 30 years’ experience in manufacturing, analytics, and statistical applications, Douglas C. Fair serves as chief operating officer for InfinityQS. Fair’s career began at Boeing Aerospace, and he worked as a quality systems consultant before joining InfinityQS in 1997. Fair earned a bachelor’s degree in industrial statistics from the University of Tennessee and a Six Sigma Black Belt from the University of Wisconsin. He’s a regular contributor to various quality magazines and has co-authored two books on industrial statistics: Innovative Control Charting (ASQ Quality Press, 1998) and Quality Management in Health Care (Jones and Bartlett Publishing, 2004).