You Can’t Improve a Process If You Can’t Measure It

Putting continuous improvement and precision measurement into everyone's hands

Dirk Dusharme @ Quality Digest

January 13, 2022

No matter which quality management methodology, technique, or fad du jour you chose during the past 40 years, from quality circles to TQM to Six Sigma, all had one thing in common: data. In manufacturing this eventually meant measurement data. Whether it was dimensional, time, temperature, frequency, pressure, or some other metric, somewhere in manufacturing somebody used measurement equipment to verify the quality characteristics of the part being manufactured. This person was typically a specialist either by profession or experience.

This specialization worked against the goal of modern quality, which is that quality is everyone’s job. How can a manufacturing employee be part of quality if they can’t measure their own work or verify the parts they’re assembling? And yet, that’s the way it’s been since the beginning of the Industrial Revolution. Assemblers assemble and testers test. Yes, eventually line workers got go/no go gauges, snap gauges, and some simple dimensional tools, but assembly complexity grew faster than the capability of the tools available to the average worker. It’s like giving a carpenter a stack of precut lumber, a box of nails, and plans for a house… but no tape measure.

The tech problem

Back in 1982 when Quality Circle Digest (later Quality Digest) was first published, the world was only just beginning to see the growth of personal computing and the integration of computer systems into test equipment. User interfaces for any kind of manufacturing equipment, including test equipment, were analog, cumbersome, limited in functionality, and took a skilled technician to use. The goal of early equipment was simply to display measurement results to the operator, who would then record, with pencil and paper, the readings. Often the technician was a data collector/logger only. Interpreting the data was left to the guys—almost always guys—with bow ties, pocket protectors, and slide rules (look it up... I still have my dual-base log/log Pickett). The idea that a measurement device could, with a push of a button, take a measurement, perform temperature compensation, apply correction factors (dimensional or otherwise), collect the data, store it, analyze it, interpret it, display the results to an operator, and send corrections to the machine that created the part in the first place, and do it all on a computer the size of a National Geographic, was the stuff of dreams... and probably not even that.

Even early coordinate measuring machines (CMMs), as amazing as they were, were primitive. Pre-electronic display models simply had analog scales for each axis. Later models replaced analog scales with electronic readouts. Innovation emphasis was on stability and increasing accuracy and repeatability. Competitive advantage went to those whose CMMs were more accurate or faster. The operator was almost an afterthought.

The same was true with simple handheld measurement tools such as micrometers or calipers. You read the measurement from a Vernier scale and noted the data in your engineering notebook—that’s a paper notebook for those of you born after 2000—with a Ticonderoga No. 2 pencil... or a pen if you were cocky.

So quality control involved simple tools on the line and more complex equipment for the test techs. Monitoring and improvement were left to a select few, making “quality is everyone’s job” more a slogan than reality. The technology didn’t support it. The skill and time involved in doing 100-percent inspection measurements on the fly, in production, precluded pushing the measurement portion of the quality function to the shop floor in any meaningful way.

Thanks, Messrs. Jobs and Gates

By the mid-1980s, the personal computer had changed the entire face of precision production measurement and, by extension, quality management. Although there are plenty of great examples, let’s just look at 3D measurement. The late 1980s introduced two hardware technologies and at least one software initiative that pushed industrial 3D measurement to the forefront: the invention of the laser tracker in 1987 by Kim Lau, now CEO of API but then at NIST; and the ROMER SARL portable articulated arm by Homer Eaton and Romain Granger in 1986. Actually, Eaton had patented an earlier version of the arm in the 1970s, but it required a 40-lb computer with a Teletype as an input device to run it and was tailored specifically for tube measurement. FARO released a hinged arm in 1984 and an articulated arm in 1995.

With the proliferation of 3D metrology technology came the software that ran each brand’s equipment, which was fine until you needed to share data between equipment. So in the mid-1980s, the Consortium for Advanced Manufacturing International created the Dimensional Measuring Interface Standard (DMIS) to enable 3D measuring equipment (CMMs initially) to communicate seamlessly with each other. Traditionally, each CMM manufacturer developed its own programming language and embedded it within its measuring-application software, which meant that CMMs (sometimes even those from the same vendor) were unable to execute each other’s inspection part programs. DMIS was one step toward today’s connected factory.

The same era brought us laser scanning, structured light scanning, laser radar, and digital photogrammetry. Although those technologies, especially photogrammetry, had been around for ages, personal computers brought them out of the lab and democratized them by making them simpler, faster, and, most important, ready for the shop floor and eventually the assembly line. Portable 3D scanning in particular, with its thousands, then hundreds of thousands, then millions of points-per-second data acquisition, would have been impossible without the PC.

The same advances were happening with simple hand tools. Micrometers and calipers, for instance, were fitted with RS-232 or proprietary serial interfaces that allowed them to connect directly to a PC. This enabled hands-free—and error-free—data entry. Line personnel could easily collect data into a PC, which could give go/no go results, show trends, or even build control charts on the fly.
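The kind of on-the-fly evaluation described above is conceptually simple. Here is a minimal sketch in Python; the nominal dimension, tolerance, and readings are hypothetical, and a real setup would pull each value from the gauge’s serial interface rather than a hard-coded list:

```python
# Sketch of go/no-go checks and rough control limits for caliper
# readings streamed into a PC. Values are hypothetical examples;
# a real system would read them from the gauge's RS-232 port.
from statistics import mean, stdev

NOMINAL = 25.40   # mm, hypothetical target dimension
TOL = 0.05        # mm, hypothetical +/- tolerance band

def go_no_go(reading: float) -> str:
    """Flag a single measurement against the tolerance band."""
    return "GO" if abs(reading - NOMINAL) <= TOL else "NO-GO"

def control_limits(readings: list[float]) -> tuple[float, float]:
    """Rough 3-sigma limits for an individuals chart from recent readings."""
    xbar, s = mean(readings), stdev(readings)
    return xbar - 3 * s, xbar + 3 * s

readings = [25.41, 25.39, 25.42, 25.38, 25.40, 25.47]
for r in readings:
    print(f"{r:.2f} mm -> {go_no_go(r)}")

lcl, ucl = control_limits(readings)
print(f"Control limits: {lcl:.3f} .. {ucl:.3f} mm")
```

Even this toy version shows why moving the check to the worker’s bench matters: the last reading fails its tolerance immediately, rather than being discovered later at an offline inspection station.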

But with all the hardware and software advances, we still hadn’t dovetailed technology and shop-floor work into something that gave the average shop-floor worker tools they could use to truly be part of the quality process. At some point the focus had to shift from technology to simpler operator interfaces. For quality to be everyone’s responsibility, the measurement tools of quality would either have to be usable by anyone, even relatively unskilled workers, or be able to work alongside them where the work was being done.

The forgotten human remembered

From QD’s perspective, during the 1980s and 1990s most of the emphasis on stationary CMMs and portable CMMs—trackers, scanners, and the like—was on improving these technologies by making them more accurate, faster, and more environmentally robust. The same was true of the software that ran them. How fast could it process data? How well did it compare measurement data to CAD? What CAD formats did it support? Could it communicate with other software or hardware? Hardware and software still required a fair degree of training, again keeping the inspection process in the hands of the few.

But as these technologies matured, they also became commoditized. Specsmanship became more difficult because the specs really weren’t all that different between competing companies with the same tech. The reason you bought from Vendor A as opposed to Vendor B was due more to salesmanship, relationship, brand preference, and maybe because one brand had some feature you valued—likely having to do more with usability—than anything else.

Measurement-tool commoditization and the continuing push to move measurement and data gathering to the shop floor, coupled with faster-and-faster and smaller-and-smaller computers, have led to a more intense look at the user experience and user interface as a selling point.

And thus we entered the era of user experience/user interface (UX/UI). The operator has at last come to the forefront. In the past 15 years or so, developers began asking themselves, “How can we make our equipment easier to learn and operate?” Ergonomics, task hand-offs, ease of use, portability, and training started to become as important as how many millions of points of data per second could be collected. Today, almost every large metrology company has UX/UI staff or entire UX/UI departments.

The result is that now almost any worker can perform complex measurements and get meaningful results on the work they’re performing. If not the assembly people themselves, then at least someone near or in the cell at that step of the assembly process. The loop between worker task and task feedback is quickly closing.

Are you looking at me?

Simultaneous with the focus on the user experience has been the effort to remove the user from certain inspection and reporting operations altogether. Machine learning, artificial intelligence, digital twins, and other technologies, combined with the connected factory, mean that some of the decision-making that might previously have been made by shop-floor personnel, who often have limited knowledge of the entire process, is now being handled automatically. AI is allowing us to look at and evaluate entire processes, even those that involve manual tasks that don’t generate electronic data.

We recently saw an example of AI spotting errors in a manual assembly process. This particular system uses AI to monitor video feeds that watch humans at work. By observing both product and human motion, it can detect when a human fails at some task. But what’s unique is that it can also look upstream in the process, back in time, and see if there was something that caused the downstream human to miss or delay a task. In the actual example we witnessed, a part got placed on a conveyor with a component in the wrong place. A worker downstream was delayed by having to adjust to the upstream error. This type of cascading problem happens a lot on assembly lines and can be difficult to quickly identify based on human reporting, but it’s a perfect job for AI.

We are also seeing more assembly-line robots fitted with 3D scanners and other types of automated inspection. The closer the inspection is to the work being done, the faster feedback can be given to workers on the work they are performing.


We are approaching a point where management—we hope—and technology are enabling every worker to be part of quality and continuous improvement. The importance of this is twofold. Obviously, the earlier you spot problems the better. So getting inspection as close to the work as possible is good for the company.

But more important is that right now, even with all of the automation on the plant floor, it’s still people who do the majority of the work. And most people want to do a good job. Most people want the fruit of their labor to be something they can be proud of. Not being able to do a good job because you lack tools and feedback is demoralizing. Having mistakes originate from your station or line is even more demoralizing if you have no control over the mistakes due to lack of knowledge.

Giving workers the ability to easily measure their work and immediately see and respond to workmanship or process errors as they happen allows them to be part of the quality process. Further, if they consistently see errors reported by test equipment in real time, it helps them to identify the inherent weaknesses in the process that otherwise would be invisible because offline inspection is too far removed from their task. If, in conjunction with this, management looks at machine inspection and reporting as a means to spot process errors and then uses people who do the work to suggest process improvements, then quality will have truly become everyone’s job.

About The Author


Dirk Dusharme @ Quality Digest

Dirk Dusharme is Quality Digest’s editor in chief.