Statistics Article

By: Christopher Shoe

According to a recent LNS Research survey, 37 percent of quality leaders cite an inability to measure quality metrics as their No. 1 barrier to achieving quality goals. Even worse, the survey showed four in five companies have poor visibility into real-time metrics.

These figures highlight a central problem in quality management: In an era of increasingly large data sets, how can manufacturers leverage these data for meaningful improvement?

It’s a question that’s especially relevant for manufacturers engaged in layered process audits (LPAs), a high-frequency verification strategy where teams conduct short audits every shift. With hundreds or even thousands of audits taking place during the course of a year, making sense of a large volume of data is a core challenge of LPA programs.

While plant managers often track metrics at a granular level, executives need to look at the data a little differently. Let’s look at four of the most important enterprise-level LPA metrics to track.

By: Yen Duong, Knowable Magazine

If you think it’s hard to tell how you’re doing at your job, imagine being a hockey goalie. Let’s say you block every shot in a game. Was that performance due to your superior skills? Or maybe just to a lack of skill in your opponents?

Evaluating ice hockey players' performance is getting easier, for goalies and their teammates. Advances in data collection—including video that can be slowed down and analyzed—and the application of more sophisticated statistics are allowing analysts to better assess how all players contribute to team performance on the ice. Among the more exciting outcomes are data-rich maps of the rink that can reveal especially successful shots or strategic passes.

“Back in the day, like decades ago, we could only really credit players for goals, and maybe assists and stuff like that,” says Namita Nandakumar, co-author of a recent review of trends in hockey analytics in the Annual Review of Statistics and Its Application. “Now research shows that there are other aspects of the game that you can be consistently better or worse at.”

By: Jody Muelaner

In a general sense, capability is the ability to do something. Within manufacturing, capability is given a much more specific definition. It is an expression of the accuracy of a process or equipment, in proportion to the required accuracy.

This can be applied to production processes, in which case any random variation and bias in the process must be significantly smaller than the product tolerance. It can also be applied to measurements, where any uncertainties in the measurement must be significantly smaller than the product tolerance or process variation that is being measured.
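The relationship between process spread and tolerance is usually summarized with capability indices. As a minimal sketch (the function name and sample values are illustrative, not from the article), Cp compares the tolerance width to six standard deviations of process spread, while Cpk also penalizes an off-center mean:

```python
import statistics

def capability_indices(samples, lsl, usl):
    """Estimate Cp and Cpk from a sample of measurements.

    Cp compares the tolerance width (USL - LSL) to the process
    spread (6 sigma); Cpk also penalizes an off-center process mean.
    """
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)
    return cp, cpk

# Illustrative: a well-centered process has Cp equal to Cpk.
cp, cpk = capability_indices([9.8, 9.9, 10.0, 10.1, 10.2], lsl=9.0, usl=11.0)
```

When the process mean drifts away from the tolerance midpoint, Cpk drops below Cp even though the spread is unchanged, which is why both indices are reported.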

By: Rohit Mathur

Whatever the process or type of data collected, all data display variation. This is also true in software development. Any measure or parameter of interest to our business will vary from time period to time period, e.g., number of incidents per week or month, time taken in resolving incidents, number of tickets encountered in a production support environment per month, and defect density in code.

Understanding variation is about being able to describe the behavior of processes or systems over time. This variation can be stable, predictable, and routine, or unstable, unpredictable, and exceptional. Being able to distinguish between stable or common-cause variation, and unstable or special-cause variation, helps us to decide the type of action needed to improve the process. The control chart, developed by Walter Shewhart, is the tool that enables us to do so.
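The distinction between common-cause and special-cause variation can be sketched with an individuals (XmR) chart, the simplest of Shewhart's charts. The helper names and the incident counts below are illustrative, but the 2.66 scaling constant (3 divided by the bias-correction constant d2 = 1.128 for moving ranges of two) is the standard one:

```python
def xmr_limits(values):
    """Compute the natural process limits for an individuals (XmR) chart.

    The limits are the mean plus/minus 2.66 times the average moving
    range, where 2.66 = 3 / d2 with d2 = 1.128 for ranges of size two.
    """
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    center = sum(values) / len(values)
    return center - 2.66 * mr_bar, center, 2.66 * mr_bar + center

def special_causes(values):
    """Return the points outside the natural process limits."""
    lcl, _, ucl = xmr_limits(values)
    return [v for v in values if v < lcl or v > ucl]

# Illustrative weekly incident counts: the spike to 28 signals a
# special cause; the routine ups and downs are common-cause variation.
signals = special_causes([12, 14, 13, 15, 12, 14, 13, 15, 12, 28])
```

Points inside the limits call for working on the system as a whole; points outside them call for finding and addressing the specific assignable cause.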

By: Romesh Saigal, Abdullah AlShelahi

Soon after the Great Recession, the U.S. stock markets plunged—and rebounded within 36 minutes. The Dow Jones Industrial Average dropped more than 9 percent, losing more than 1,000 points before suddenly recovering.

This May 6, 2010, event was the first recorded “flash crash.” Although it didn’t have long-term effects, it raised concerns among investors about the stability of the stock market.

By: Scott A. Hindle

In everyday language, “in control” and “under control” are synonymous with “in specification.” Requirements have been met. Things are OK. No trouble.

“Out of control,” on the other hand, is synonymous with “out of specification.” Requirements have not been met. Things are not OK. Trouble.

Using this language, an obvious axiom would be: Take action when the process is out of control.

The everyday use of in and out of control is, however, unfortunate for control charts, the major tool of statistical process control (SPC). Why? Because in SPC these terms speak of processes as being stable or unstable. To characterize a process as stable or unstable, process limits, from process data, are needed. Specification limits are not needed.
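The point can be made concrete with a small numeric sketch (the thickness readings and specification limits below are hypothetical): a reading can sit comfortably inside the specification limits yet fall outside the natural process limits computed from the data, so the process is "in specification" but still "out of control" in the SPC sense.

```python
# Hypothetical thickness readings (mm); specs are LSL = 9.0, USL = 11.0.
readings = [10.02, 10.01, 9.99, 10.00, 10.02, 9.98, 10.01, 10.20]

# Natural process limits come from the data, not the specifications:
# mean plus 2.66 times the average moving range (individuals chart).
moving_ranges = [abs(b - a) for a, b in zip(readings, readings[1:])]
center = sum(readings) / len(readings)
ucl = center + 2.66 * sum(moving_ranges) / len(moving_ranges)

last = readings[-1]
print(9.0 <= last <= 11.0)  # True: within specification
print(last > ucl)           # True: a special-cause signal all the same
```

The final reading is nowhere near the specification limits, yet it is detectably unlike the process that produced the earlier readings, and that is precisely the signal a control chart provides and a specification check misses.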

Given the easy-to-understand basis for the action of meeting or not meeting requirements, coupled with the risk of confusion over the terms in control and out of control, why use control charts? If you are curious to see some of the benefits in doing so, read on. Two case studies are used.

Case one: Part thickness

During a regular review meeting in Plant 17, in- and out-of-specification data on the thickness of part 64 were reviewed.

By: Chad Kymal, Gregory F. Gruska

During the early 1980s, GM, Ford, and Chrysler established the Automotive Industry Action Group (AIAG), a not-for-profit organization with the mission “To improve its members’ competitiveness through a cooperative effort of North American vehicle manufacturers and their suppliers.” In the late 1980s, U.S. automotive suppliers, through the auspices of the American Society for Quality (ASQ), approached the VPs of purchasing for GM, Ford, and Chrysler and explained the burden of multiple standards that were being imposed on the supply base. Not only were there multiple OEM standards, there were hundreds of tier one standards as well.

By: Jay Arthur—The KnowWare Man

When I first learned quality improvement back in 1989 at Florida Power and Light, the consultants who trained us taught a very specific way to draw a Pareto chart. They’d been trained in Japan, the place where quality improvement first took root during the 1950s, so I took it for granted that the way they drew Pareto charts was the authentic and best way to do so.

A Pareto chart combines a bar graph with a cumulative line graph. Using the way we were taught to draw a Pareto chart (figure 1), the bars are touching, making it extremely easy to visually compare levels from one bar to the next. The bars span the entire available space along the x axis. The cumulative line graph springs from the bottom left corner of the first big bar, and each subsequent point is plotted from the corresponding top right corner of its bar.
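Whatever the drawing conventions, the data behind every Pareto chart are the same: categories sorted by frequency, with a running cumulative percentage for the line graph. A minimal sketch (the function name and the defect categories are illustrative):

```python
def pareto_data(counts):
    """Sort categories by frequency, descending, and compute the
    cumulative percentage line a Pareto chart overlays on the bars."""
    total = sum(counts.values())
    items = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    cumulative, running = [], 0
    for _, n in items:
        running += n
        cumulative.append(round(100 * running / total, 1))
    return items, cumulative

# Illustrative defect tallies from a hypothetical inspection log.
items, cumulative = pareto_data(
    {"scratch": 48, "dent": 27, "misalignment": 15, "other": 10}
)
```

Plotting `items` as touching bars and `cumulative` as the overlaid line reproduces the style described above; the sorting is what makes the "vital few" categories stand out on the left.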

By: Ryan E. Day

Current business conversation often focuses on data and big data. Data are the raw information from which statistics are created; statistics, in turn, provide an interpretation and summary of those data. Statistics make it possible to analyze real-world business problems and measure key performance indicators that enable us to set quantifiable goals. Control charts and capability analysis are key tools in these endeavors.

Control charts

Developed in the 1920s by Walter A. Shewhart, control charts are used to monitor industrial or business processes over time. Control charts are invaluable for determining if a process is in a state of control. But what does that mean?

By: William A. Levinson

Anthony Chirico describes how narrow-limit gauging (NLG, aka compressed limit plans) can reduce enormously the required sample size, and therefore the inspection cost, of a traditional attribute sampling plan. The procedure consists of moving acceptance limits t standard deviations inside the engineering specifications, which increases the acceptable quality level (AQL) and therefore reduces the sample size necessary to detect an increase in the nonconforming fraction.
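The amplification effect behind NLG can be sketched with a normal-theory calculation (an illustrative sketch, not Chirico's plan): moving a one-sided gauge limit t standard deviations inside a specification sitting z sigmas from the mean multiplies the fraction of parts flagged, so a process shift shows up in a much smaller sample.

```python
from math import erf, sqrt

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def fraction_beyond(z_spec, t=0.0):
    """Fraction of a normal process beyond a one-sided upper gauge
    limit placed t standard deviations inside a spec at z_spec sigmas."""
    return 1 - phi(z_spec - t)

# With the spec 3 sigma out, about 0.135% of parts are nonconforming,
# but compressing the gauge limit by t = 1 sigma flags about 2.3%:
# roughly a 17-fold increase in the signal available per part gauged.
p_spec = fraction_beyond(3.0)        # fraction beyond the actual spec
p_gauge = fraction_beyond(3.0, 1.0)  # fraction beyond the narrow limit
```

Because far more parts fall beyond the compressed limit than beyond the specification itself, a shift in the process mean changes the gauged count quickly, which is the mechanism behind the sample-size reduction described above.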
