With several generations of statistical process control (SPC) technology under our belts, it may be time to rethink how we apply SPC in the 21st century. Basic techniques have been practiced since the 1930s. Some companies will soon be able to say, “we’ve been practicing SPC for 100 years.” Since the time Walter Shewhart first proposed the techniques, they have been widely deployed.
Over the years, there have been improvements in how SPC is used. Some of this can be attributed to technological changes. When personal computers and software arrived, the tedium of manual calculations was reduced. When databases came into the picture, it became easier to organize and find data gathered for SPC. When the Internet arrived, it became easier to share and publish SPC information.
Despite the improvements, our current approach to SPC is ripe for an overhaul. A combination of technology improvements, organizational changes, and a more systems-based mindset among companies has set the stage for the next leap forward.
Before thinking about that leap, it is instructive to consider how SPC usage has evolved. While none of this information may be new, it is important to view the sometimes small changes to understand the “big picture” of statistical process control.
1st Generation SPC
When control charts were introduced, humans did all the work manually, using pencil and paper. The process began by identifying an important quality metric. Next, data was gathered from production using a sampling plan. For example, three pieces were measured every 30 minutes with the values dutifully recorded on paper. When enough data had been gathered, a set of control limits was calculated, by hand. The limits were transferred onto graph paper and the data points were plotted. Out-of-control points were studied, assignable cause variation was removed, and limits may have been recalculated. Eventually, a chart was posted and samples were plotted directly on the chart. As new samples were plotted, operators were taught to look for out-of-control conditions and trends.
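The by-hand limit calculation described above is simple enough to sketch. Here is a minimal example (in Python rather than on graph paper), assuming three-piece subgroups and the standard published X-bar/R control-chart constants for a subgroup size of three; the measurement values are made up:

```python
# X-bar and R chart limits for subgroups of size 3, using the standard
# control-chart constants A2, D3, D4 (tabulated values for n = 3).
A2, D3, D4 = 1.023, 0.0, 2.574

def xbar_r_limits(subgroups):
    """Compute center lines and control limits for X-bar and R charts."""
    xbars = [sum(s) / len(s) for s in subgroups]    # subgroup means
    ranges = [max(s) - min(s) for s in subgroups]   # subgroup ranges
    xbarbar = sum(xbars) / len(xbars)               # grand mean (center line)
    rbar = sum(ranges) / len(ranges)                # mean range (center line)
    return {
        "xbar": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),  # LCL, CL, UCL
        "r": (D3 * rbar, rbar, D4 * rbar),                            # LCL, CL, UCL
    }

# Example: four subgroups of three measurements each
limits = xbar_r_limits([[10.1, 9.9, 10.0], [10.2, 10.0, 9.8],
                        [9.9, 10.1, 10.0], [10.0, 10.2, 10.1]])
```

In the first generation, every one of these steps was a human's pencil-and-paper task, repeated each time limits were recalculated.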
- What humans had to know: Basic math skills, basic understanding of variation, basic SPC knowledge, how to use a formula to calculate values, SPC chart interpretation.
- What humans had to do: Record data on paper, make calculations on paper, plot data on graph paper, interpret control charts for out-of-control conditions, and make process decisions based on chart interpretations.
- Challenges: Pencil and paper data entry and calculations were tedious and error-prone. The chart and data were only locally available and only by direct process workers. The quality of chart interpretation varied based on the training and skills of those involved. The history of control chart activity, which could also be described as the “previous learning” of a control chart, was not easily stored and reviewed—except by someone who likes file cabinets. The time-consuming nature of SPC charting put a limit on the number of metrics that could be monitored.
2nd Generation SPC
In the early 1980s, personal computers and software became popular. Once companies saw the benefits of word processing and spreadsheet software, they were quick to find software to help in their SPC work. The early SPC programs all shared several traits. They allowed the user to enter, save, and retrieve data from quality metrics. They did the basic statistical calculations for the user. They displayed data charts automatically, reducing the need for graph paper and manual charting. These programs tended to store data in a proprietary way that was convenient and efficient for the program to manipulate.
- What humans had to know: Basic computer skills, basic understanding of variation, basic SPC knowledge, SPC chart interpretation. Also, it wasn’t a bad idea to know how to back up data files.
- What humans had to do: Record data in the computer; interpret control charts for out-of-control conditions, and make process decisions based on chart interpretation.
- Challenges: The data gathered for SPC tended to be captive to the program where it was entered. Since some SPC software was difficult to use, incorrect application of SPC ensued. The software could do more than a person with paper and pencil, leading to a proliferation of new analytical choices that might be made incorrectly. Data files were easily lost or deleted because backup systems were not yet widely deployed. The quality of chart interpretation varied, based on the training and skills of those involved and on how correctly the software and charts had been configured. Data and charts were still primarily used and available only locally, where the data originated. The time-consuming nature of SPC charting still limited the number of metrics that could be monitored.
3rd Generation SPC
In the 1990s, computers were tied into local area networks (LANs). Computers also became “easier to use” because of graphical user interfaces, mice, and more standardized software. Database programs became more widespread and were used as consistent data storage for various applications. These trends impacted the newer versions of SPC software. They were developed with graphical interfaces and tended to store data in a database rather than as proprietary files. Additionally, data could now be shared across networks, allowing access to users on multiple computers.
- What humans had to know: Basic computer skills, understanding a computer network, basic understanding of variation, basic SPC knowledge, SPC chart interpretation, graphical user interfaces and basic database technology.
- What humans had to do: Record data in the computer; interpret control charts for out-of-control conditions, and make process decisions based on chart interpretation.
- Challenges: The increasing number of choices available to the user in the software led to problems of SPC misapplication and general confusion among software users. The quality of chart interpretation varied, based on the training and skills of those involved and on how correctly the software and charts had been configured. More metrics could be tracked, but there was still a human bandwidth issue.
| | Human must know… | Human must do… | Problems… |
| --- | --- | --- | --- |
| 1st generation SPC | Basic math, variation, SPC basics, limit formulas, chart interpretation | Record, calculate, and plot on paper; interpret charts; make process decisions | Tedious and error-prone; data local only; interpretation quality varied; little stored history; few metrics monitored |
| 2nd generation SPC | Basic computer skills, variation, SPC basics, chart interpretation | Enter data in the computer; interpret charts; make process decisions | Data captive to the program; SPC misapplied; data files lost; data local only; few metrics monitored |
| 3rd generation SPC | Computer and network skills, databases, variation, SPC basics, chart interpretation | Enter data in the computer; interpret charts; make process decisions | Misapplication and confusion from more choices; interpretation quality varied; human bandwidth still limits metrics |
You might notice that the human role has not changed. The tools may be different and the calculations more accurate, but the basic workflow has not changed. First-generation computer systems often mimic, almost to a fault, the manual systems they are designed to replace. Rather than rethink the problem, it is easier to re-use the original thinking and create a computer solution that does the same thing.
Here's a question: How many quality metrics could a third-generation SPC user keep track of, compared to an SPC user trained by Shewhart himself? This is obviously rhetorical, and today’s SPC user probably can track more metrics—but it's not a huge difference. Today’s SPC systems still require significant human interaction, and the number of metrics we can track is not nearly as large as the number we might like to track given the resources.
Now, before the Luddite mode sets in, let me say this: the goal is not to take away jobs. The goal is to eliminate mundane, repetitive jobs so these workers can concentrate on more value-added work like maintaining customer satisfaction and loyalty.
4th Generation SPC
You won’t find a definition for fourth-generation SPC on the National Institute of Standards and Technology (NIST) web site. If you Google long enough, it will probably show up somewhere. There is nothing official about this; the name is just a way to think about and discuss SPC's next leap forward.
Let’s start by describing some things in today’s computing environment. A key fact is that data are flowing into databases, on an unprecedented scale and from several sources. It comes from enterprise resource planning (ERP) systems such as SAP. It comes from plant automation systems such as those from Rockwell Automation. It comes from laboratory information management systems (LIMS) such as Labworks. It also comes from various systems developed in-house. The data might be collected by automated systems or from users of enterprise- or web-based applications; the source is unimportant, but the fact that this data lands in a relational database is quite useful. Much of this data ends up in Microsoft SQL Server, Oracle, IBM, or MySQL databases.
Data in relational databases is accessible. Accessibility can be configured locally, plantwide, intranetwide, or even worldwide via the Internet. There are many database query tools available and many applications can ask for data. This data can be used for reporting, analysis, decision making, and yes… even for SPC.
Another factor in today’s computing environment is that people are connected and can communicate instantly. E-mail, text messaging, social networking, and the Internet in general have greatly enhanced employee communication. This creates new opportunities for quality improvement across previously insurmountable barriers.
Any self-respecting fourth-generation SPC system will take these factors into account. Rather than mimic the paper-based systems of the past, fourth-generation SPC should expand SPC benefits by an order of magnitude.
Features of 4th Generation SPC
In a fourth-generation SPC system, analysis is separated from data collection and data entry. Today, to know if a point is out-of-control on an X-bar chart, a common approach is to start by entering or importing the data into an application that can make X-bar charts and then looking at the resulting chart. Imagine that this data originated from a weight sensor in a plant and was stored in a database. Why not leave the data in the database and analyze it right there with SPC software?
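As a sketch of that idea, suppose the weight readings already live in a database table. The table name, column names, and control limits below are illustrative assumptions, and SQLite stands in for whatever relational database the plant actually uses:

```python
import sqlite3

# Hypothetical setup: sensor readings land in a database table on their own.
# (An in-memory SQLite database stands in for the plant's real database.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE weights (ts TEXT, value REAL)")
conn.executemany("INSERT INTO weights VALUES (?, ?)",
                 [("t1", 10.0), ("t2", 10.1), ("t3", 9.9), ("t4", 12.5)])

def out_of_control(conn, lcl, ucl):
    """Query the readings in place and return any outside the control limits."""
    rows = conn.execute("SELECT ts, value FROM weights").fetchall()
    return [(ts, v) for ts, v in rows if v < lcl or v > ucl]

# The analysis never re-enters or re-imports the data; it just asks the database.
alarms = out_of_control(conn, lcl=9.5, ucl=10.5)
```

The point is the separation: data collection fills the table, and the SPC analysis is simply a query plus a rule applied to whatever it finds there.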
We pay attention to metrics because we want to quickly know when a quality issue develops. In today’s SPC paradigm, we check a chart at some regular interval—say hourly, daily, or weekly. Often, the chart will show no signal. In other words, everything is all right. That is great news, but a few minutes of valuable time were used for no reason. Imagine if the system could just tap you on the shoulder and say, “Hey… you might want to check on the chart for weight.” It would do this only when a significant signal was detected. Now, rather than being able to track, say, 10 charts, you could potentially be kept aware of hundreds or thousands. The main feature is that the task of interpreting charts for out-of-control conditions has been delegated to the computer. In previous SPC systems, this has always been a human's job.
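A minimal sketch of that delegation, assuming two simple detection rules: a point beyond the control limits, and a run of eight consecutive points on one side of the center line (one common Western Electric-style test). The metric names, values, and limits are made up:

```python
def signals(values, center, lcl, ucl):
    """Flag out-of-control conditions so a human only looks when there is a signal."""
    alerts = []
    # Rule 1: any point beyond the control limits
    for i, v in enumerate(values):
        if v > ucl or v < lcl:
            alerts.append((i, "beyond control limits"))
    # Rule 2: a run of 8 consecutive points on the same side of the center line
    run, side = 0, 0
    for i, v in enumerate(values):
        s = 1 if v > center else -1 if v < center else 0
        if s != 0 and s == side:
            run += 1
        else:
            run = 1 if s != 0 else 0
            side = s
        if run == 8:
            alerts.append((i, "run of 8 on one side"))
    return alerts

# Scan many charts at once; only charts with a signal surface to a human.
metrics = {
    "weight": ([10.0, 9.9, 10.1, 10.0, 9.8, 10.2, 10.0, 9.9], 10.0, 9.5, 10.5),
    "length": ([5.2, 5.1, 5.3, 5.2, 5.1, 5.2, 5.3, 5.1], 5.0, 4.5, 5.5),
}
for name, (vals, center, lcl, ucl) in metrics.items():
    for i, rule in signals(vals, center, lcl, ucl):
        print(f"Check the {name} chart: {rule} at sample {i}")
```

Quiet charts produce no output at all; only the length chart, with its run of eight points above the center line, asks for attention. Scaling this loop from two metrics to two thousand costs the human nothing.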
A fourth-generation SPC system should also communicate with you. It should do this even when you are not near the computer or actively running a specific program. After all, the computer knows when new data arrives in the database. It also knows how to analyze the data. So when it discovers something important, it should be able to tell you. Most people carry cell phones and use e-mail, so these are logical channels for the fourth-generation SPC system to use.
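A sketch of what such a notification might look like, using Python's standard e-mail modules; the recipient address and mail host below are placeholders, not real endpoints:

```python
from email.message import EmailMessage
import smtplib

def build_alert(metric, rule, recipient):
    """Compose the e-mail the SPC system would send when it detects a signal."""
    msg = EmailMessage()
    msg["Subject"] = f"SPC alert: {metric}"
    msg["To"] = recipient
    msg.set_content(
        f"Hey... you might want to check the chart for {metric}: {rule}."
    )
    return msg

def send_alert(msg, host="mail.example.com"):
    """Deliver the alert via SMTP (host is a placeholder for a real mail server)."""
    with smtplib.SMTP(host) as smtp:
        smtp.send_message(msg)

msg = build_alert("weight", "point beyond control limits", "quality@example.com")
# send_alert(msg)  # uncomment once a real SMTP host is configured
```

The same composed message could just as easily be routed to a text-messaging gateway; the essential feature is that the system reaches out to the person, not the other way around.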
Much of the technology needed to create fourth-generation SPC exists. Clever integrators are starting to deploy these systems. Is it time to rethink your approach to SPC?
Summary
We have been applying SPC in much the same way for a long time. This approach limits the number of quality metrics that can be tracked effectively because it is human-resource intensive. It's time for a new and innovative approach to SPC: a fourth-generation approach. This approach should greatly expand the number of quality metrics we track. It should do this without significantly increasing costs, and it should leverage today's database and instant-communications technology. Fourth-generation SPC should create an accelerating cycle of improving quality and reducing costs.