It’s Time for a New and Innovative Approach to SPC
Fourth-generation SPC will create an accelerating cycle of improved quality and reduced costs.
Steve Daum
Published: Sunday, June 14, 2009 - 22:00
With several generations of statistical process control (SPC) technology under our belts, it may be time to rethink how we apply SPC in the 21st century. The basic techniques have been practiced since the 1930s; some companies will soon be able to say, “We’ve been practicing SPC for 100 years.” Since the time Walter Shewhart first proposed the techniques, they have been widely deployed.
Over the years, there have been improvements in how SPC is used. Some of this can be attributed to technological changes. When personal computers and software arrived, the tedium of manual calculations was reduced. When databases came into the picture, it became easier to organize and find data gathered for SPC. When the Internet arrived, it became easier to share and publish SPC information.
Despite the improvements, our current approach to SPC is ripe for an overhaul. A combination of technology improvements, organizational changes, and a more systems-based mindset among companies has set the stage for the next leap forward.
Before thinking about that leap, it is instructive to consider how SPC usage has evolved. None of this information may be new, but looking at these sometimes-small changes helps in understanding the “big picture” of statistical process control.
When control charts were introduced, humans did all the work manually, using pencil and paper. The process began by identifying an important quality metric. Next, data was gathered from production using a sampling plan. For example, three pieces were measured every 30 minutes with the values dutifully recorded on paper. When enough data had been gathered, a set of control limits was calculated, by hand. The limits were transferred onto graph paper and the data points were plotted. Out-of-control points were studied, assignable cause variation was removed, and limits may have been recalculated. Eventually, a chart was posted and samples were plotted directly on the chart. As new samples were plotted, operators were taught to look for out-of-control conditions and trends.
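For readers who have only seen software do this step, the sketch below shows the arithmetic an operator once performed by hand to set X-bar and R chart limits. The subgroup values are made up for illustration, and the constants A2, D3, and D4 for subgroups of three come from the standard control-chart constant tables.

```python
# Hand-calculation sketch for an X-bar and R chart (subgroup size n = 3).
# Subgroup values are illustrative; constants come from standard SPC tables.

subgroups = [                 # three pieces measured every 30 minutes
    [10.2, 10.4, 10.1],
    [10.3, 10.2, 10.5],
    [10.1, 10.3, 10.2],
    [10.4, 10.6, 10.3],
]

A2, D3, D4 = 1.023, 0.0, 2.574    # control-chart constants for n = 3

xbars = [sum(s) / len(s) for s in subgroups]     # subgroup averages
ranges = [max(s) - min(s) for s in subgroups]    # subgroup ranges

xbar_bar = sum(xbars) / len(xbars)               # grand average (center line)
r_bar = sum(ranges) / len(ranges)                # average range (center line)

# X-bar chart limits
ucl_x = xbar_bar + A2 * r_bar
lcl_x = xbar_bar - A2 * r_bar

# R chart limits
ucl_r = D4 * r_bar
lcl_r = D3 * r_bar

print(f"X-bar chart: CL={xbar_bar:.3f}  UCL={ucl_x:.3f}  LCL={lcl_x:.3f}")
print(f"R chart:     CL={r_bar:.3f}  UCL={ucl_r:.3f}  LCL={lcl_r:.3f}")
```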
In the early 1980s, personal computers and software became popular. Once companies saw the benefits of word processing and spreadsheet software, they were quick to find software to help in their SPC work. The early SPC programs all shared several traits. They allowed the user to enter, save, and retrieve data from quality metrics. They did the basic statistical calculations for the user. They displayed data charts automatically, reducing the need for graph paper and manual charting. These programs tended to store data in a proprietary way that was convenient and efficient for the program to manipulate.
In the 1990s, computers were tied into local area networks (LANs). Computers also became “easier to use” because of graphical user interfaces, mice, and more standardized software. Database programs became more widespread and were used as consistent data storage for various applications. These trends impacted the newer versions of SPC software. They were developed with graphical interfaces and tended to store data in a database rather than as proprietary files. Additionally, data could now be shared across networks, allowing access to users on multiple computers.
1st-generation SPC
Human must know: Basic math, computations, chart plotting, SPC, chart interpretation.
Human must do: Plot charts by hand; interpret the chart; act on the interpretation.
Problems: Error prone. Data and charts are only local. Reviewing and storing the history of charts is difficult. The number of metrics is limited.

2nd-generation SPC
Human must know: Computer skills, data entry skills, SPC, chart interpretation.
Human must do: Enter data into the computer; interpret the chart; act on the interpretation.
Problems: Data is captive to the program. Data is easily lost. Data and charts are only local. More analytic choices invite misapplication. The number of metrics is limited.

3rd-generation SPC
Human must know: GUI skills, basic database knowledge, LAN knowledge, data entry skills, SPC, chart interpretation.
Human must do: Enter data into the computer; interpret the chart; act on the interpretation.
Problems: More and more analytic choices create confusion. More metrics are possible, but interpretation is still human intensive.
Summarizing the first three generations of SPC
You might notice that the human role has not changed. The tools may be different and the calculations more accurate, but the basic workflow has not changed. First-generation computer systems often mimic, almost to a fault, the manual systems they are designed to replace. Rather than rethink the problem, it is easier to re-use the original thinking and create a computer solution that does the same thing.
Here’s a question: How many quality metrics could a third-generation SPC user keep track of, compared to an SPC user trained by Shewhart himself? The question is rhetorical, and today’s SPC user can probably track more metrics, but the difference is not dramatic. Today’s SPC systems still require significant human interaction, and the number of metrics we can track is far smaller than the number we would like to track with the resources available.
Now, before the Luddite mode sets in, let me say this: the goal is not to take away jobs. The goal is to eliminate mundane, repetitive jobs so these workers can concentrate on more value-added work like maintaining customer satisfaction and loyalty.
You won’t find a definition of fourth-generation SPC on the National Institute of Standards and Technology (NIST) web site. If you Google long enough, it will probably show up somewhere. There is nothing official about the term; the name is just a way to think about and discuss SPC’s next leap forward.
Let’s start by describing some things in today’s computing environment. A key fact is that data is flowing into databases on an unprecedented scale, and it is flowing from several sources. It comes from enterprise resource planning (ERP) systems such as SAP, from plant automation systems such as those from Rockwell Automation, from laboratory information management systems (LIMS) such as Labworks, and from various systems developed in-house. The data might be collected by automated systems or entered by users of enterprise- or web-based applications; the source is unimportant, but the fact that this data lands in a relational database is quite useful. Much of it ends up in Microsoft SQL Server, Oracle, IBM, or MySQL databases.
Data in relational databases is accessible. Accessibility can be configured locally, plantwide, intranetwide, or even worldwide via the Internet. There are many database query tools available and many applications can ask for data. This data can be used for reporting, analysis, decision making, and yes… even for SPC.
Another factor in today’s computing environment is that people are connected and can communicate instantly. E-mail, text messaging, social networking, and the Internet in general have greatly enhanced employee communication. This creates new opportunities for quality improvement across previously insurmountable barriers.
Any self-respecting fourth-generation SPC system will take these factors into account. Rather than mimic the paper-based systems of the past, fourth-generation SPC should expand SPC benefits by an order of magnitude.
In a fourth-generation SPC system, analysis is separated from data collection and data entry. Today, to know whether a point is out of control on an X-bar chart, a common approach is to enter or import the data into an application that can make X-bar charts and then look at the resulting chart. Imagine that this data originated from a weight sensor in a plant and was stored in a database. Why not leave the data in the database and let the SPC software analyze it where it sits?
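As a rough sketch of that idea, the example below reads the weight measurements straight from a relational database and computes X-bar points in place. The table and column names (process_data, subgroup_id, weight, sample_time) are hypothetical, and sqlite3 merely stands in for whatever database the plant actually uses.

```python
import sqlite3

# Hypothetical schema: process_data(sample_time TEXT, subgroup_id INTEGER, weight REAL)
conn = sqlite3.connect("plant_history.db")   # stand-in for the production database

rows = conn.execute(
    """
    SELECT subgroup_id, weight
    FROM process_data
    WHERE sample_time >= datetime('now', '-1 day')
    ORDER BY subgroup_id
    """
).fetchall()

# Group the raw measurements into subgroups without exporting them anywhere
subgroups = {}
for subgroup_id, weight in rows:
    subgroups.setdefault(subgroup_id, []).append(weight)

# One X-bar point per subgroup, computed directly from the database rows
xbar_points = {sid: sum(vals) / len(vals) for sid, vals in subgroups.items()}
print(xbar_points)
```

The data never leaves the database on its way to a chart; the SPC logic simply queries what is already there.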
We pay attention to metrics because we want to quickly know when a quality issue develops. In today’s SPC paradigm, we check a chart at some regular interval—say hourly, daily, or weekly. Often, the chart will show no signal. In other words, everything is all right. That is great news, but a few minutes of valuable time were used for no reason. Imagine if the system could just tap you on the shoulder and say, “Hey… you might want to check on the chart for weight.” It would do this only when a significant signal was detected. Now, rather than being able to track, say, 10 charts, you could potentially be kept aware of hundreds or thousands. The main feature is that the task of interpreting charts for out-of-control conditions has been delegated to the computer. In previous SPC systems, this has always been a human's job.
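A minimal sketch of what delegating chart interpretation to the computer might look like: the function below applies two common out-of-control tests, a point beyond the control limits and a long run of points on one side of the center line. A production system would apply a fuller rule set (the Western Electric rules, for example), so treat this as illustrative only.

```python
def out_of_control_signals(points, center, ucl, lcl, run_length=8):
    """Return (index, reason) tuples for two common SPC out-of-control tests."""
    signals = []

    # Test 1: a single point outside the control limits
    for i, x in enumerate(points):
        if x > ucl or x < lcl:
            signals.append((i, "point beyond control limits"))

    # Test 2: a run of consecutive points on one side of the center line
    side, run = 0, 0          # +1 above center, -1 below, 0 on the line
    for i, x in enumerate(points):
        s = 1 if x > center else -1 if x < center else 0
        if s != 0 and s == side:
            run += 1
        else:
            side, run = s, 1
        if run == run_length:
            signals.append((i, f"{run_length} points on one side of center line"))

    return signals
```

A scheduler or database trigger could run a check like this whenever new subgroups arrive, so a person looks at a chart only when the function actually returns a signal.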
A fourth-generation SPC system should also communicate with you. It should do this even when you are not near the computer or actively running a specific program. After all, the computer knows when new data arrives in the database. It also knows how to analyze the data. So when it discovers something important, it should be able to tell you. Most people carry cell phones and use e-mail, so these are logical channels for the fourth-generation SPC system to use.
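One way such a notification might be wired up is sketched below using Python’s standard smtplib; the mail server, addresses, and message wording are placeholders rather than a recommended configuration.

```python
import smtplib
from email.message import EmailMessage

def send_spc_alert(metric_name, reason, recipient="quality.team@example.com"):
    """Notify someone that an SPC signal was detected on a metric."""
    msg = EmailMessage()
    msg["Subject"] = f"SPC signal detected: {metric_name}"
    msg["From"] = "spc-monitor@example.com"
    msg["To"] = recipient
    msg.set_content(
        f"A control chart signal was detected on '{metric_name}': {reason}.\n"
        "You might want to check on the chart for this metric."
    )

    # Placeholder mail host; a real deployment would use the plant's SMTP relay
    with smtplib.SMTP("mail.example.com") as server:
        server.send_message(msg)

# Example: called by the analysis layer when it finds an out-of-control condition
# send_spc_alert("fill weight", "point beyond control limits")
```

The same function could just as easily hand the message to a text-messaging gateway; the point is that the system reaches out to the person rather than waiting to be checked.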
Much of the technology needed to create fourth-generation SPC exists. Clever integrators are starting to deploy these systems. Is it time to rethink your approach to SPC?
We have been applying SPC in much the same way for a long time. This approach limits the number of quality metrics that can be tracked effectively because it is so intensive in human resources. It’s time for a new and innovative approach to SPC: a fourth-generation approach. This approach should greatly expand the number of quality metrics we track. It should do so without significantly increasing costs, and it should leverage today’s database and instant-communication technology. Fourth-generation SPC should create an accelerating cycle of improving quality and reducing costs.
About the author: M. Stephen Daum is director of development for PQ Systems. Prior to assuming responsibility for development, Daum was the lead programmer on PQ’s statistical software products, a position he took in 1985. He has more than 20 years of experience with control charts and control charting software and has shared that experience through presentations, training, and educational sessions for organizations throughout the United States, England, and South Africa.