Modernizing Your SPC System Can Drive Huge Improvements in Quality and Cost
Just a few decades ago, today’s personal technology was a science fiction pipe dream.
Parts 1, 2, and 3 of our series on statistical process control (SPC) have shown how data can be thoughtfully used to enable learning and improvement—and consequently, better product quality and lower production costs.
Fourteen years ago, I published “Do You Have Leptokurtophobia?” Judging by the reaction to that column, the message was needed.
Modern data-collection technologies and low-cost database storage have motivated companies to collect data on almost everything. The result is a common malady: data overload.
We are one year away from the 100th anniversary of the control chart, which Walter Shewhart created in 1924 as an aid to Western Electric’s manufacturing operations.
In last month’s column, we looked at how process-hyphen-control algorithms work with a process that is subject to occasional upsets.
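That column’s worked example isn’t reproduced here, but a minimal sketch of one common process-control rule (an integral-style adjustment toward a target) illustrates the kind of algorithm involved. Every number below (target, gain, noise level, upset size and frequency) is an illustrative assumption, not a value from the column.

```python
import random

# Minimal sketch of an integral-style process-control rule applied to a
# process subject to occasional upsets. All numbers are illustrative
# assumptions, not values from the column.

random.seed(1)

TARGET = 10.0
GAIN = 0.3           # fraction of the observed deviation removed each period
adjustment = 0.0     # cumulative controller correction
process_mean = 10.0  # true process level; shifts when an upset occurs

for t in range(1, 51):
    if random.random() < 0.05:                 # occasional upset: shift the mean
        process_mean += random.choice([-2.0, 2.0])
    observed = process_mean + adjustment + random.gauss(0.0, 0.5)
    adjustment -= GAIN * (observed - TARGET)   # push back toward the target
    print(f"t={t:2d}  observed={observed:6.2f}  adjustment={adjustment:6.2f}")
```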
Today’s manufacturing systems have become more automated, data-driven, and sophisticated than ever before.
Many articles and some textbooks describe process behavior charts as a manual technique for keeping a process on target.
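Whether done by hand or in software, the arithmetic is the same. A minimal sketch of the standard XmR (individuals and moving range) limit calculations shows how readily the “manual” technique automates; the data values are made up, while 2.66 and 3.268 are the usual XmR scaling constants.

```python
# Minimal sketch: computing XmR chart limits in software rather than by
# hand. The data are made up; 2.66 and 3.268 are the standard constants.

data = [9.8, 10.1, 10.0, 9.7, 10.3, 10.2, 9.9, 10.4, 10.0, 9.8]

moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
x_bar = sum(data) / len(data)
mr_bar = sum(moving_ranges) / len(moving_ranges)

unpl = x_bar + 2.66 * mr_bar   # upper natural process limit
lnpl = x_bar - 2.66 * mr_bar   # lower natural process limit
url = 3.268 * mr_bar           # upper range limit for the mR chart

print(f"Average = {x_bar:.3f}, mean moving range = {mr_bar:.3f}")
print(f"Natural process limits: {lnpl:.3f} to {unpl:.3f}")
print(f"Upper range limit: {url:.3f}")
```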
As we learned last month, the precision-to-tolerance ratio is a trigonometric function multiplied by a scalar constant. This means it should never be interpreted as a proportion or percentage.
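As a sketch of the algebra behind that claim (using the usual definitions; the column’s own notation may differ): with tolerance $\mathrm{Tol} = \mathrm{USL} - \mathrm{LSL}$, measurement error $\sigma_e$, product variation $\sigma_p$, and total observed variation $\sigma_x^2 = \sigma_p^2 + \sigma_e^2$,

\[
\frac{P}{T} \;=\; \frac{6\,\sigma_e}{\mathrm{Tol}}, \qquad
\sigma_e = \sigma_x \sin\theta \;\Longrightarrow\;
\frac{P}{T} \;=\; \left(\frac{6\,\sigma_x}{\mathrm{Tol}}\right)\sin\theta .
\]

The sine carries the measurement-system information, and since $\sin^2\theta = \sigma_e^2/\sigma_x^2$ is the fraction of variance due to measurement error, the ratio does not vary linearly with that fraction and cannot be read as a proportion or percentage of anything.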
The success run theorem is one of the most common statistical rationales for sample sizes used for attribute data.
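The theorem itself is compact: to demonstrate reliability $R$ at confidence $C$ with zero failures, the required sample size is $n = \ln(1-C)/\ln(R)$, rounded up. A minimal sketch (the function name is ours):

```python
import math

def success_run_sample_size(reliability: float, confidence: float) -> int:
    """Zero-failure sample size from the success run theorem:
    C = 1 - R**n, so n = ln(1 - C) / ln(R), rounded up."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# Example: demonstrating 90% reliability at 90% confidence
# requires 22 consecutive passes with no failures.
print(success_run_sample_size(0.90, 0.90))  # -> 22
```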