Statistics Article

By: Jody Muelaner

One of the key ideas in lean manufacturing is that defects should be detected as early as possible. Efforts to control manufacturing processes, so that issues can be detected before defects occur, actually predate lean. Statistical process control (SPC) is a set of methods first created by Walter A. Shewhart at Bell Laboratories during the early 1920s. W. Edwards Deming standardized SPC for U.S. industry during WWII and introduced it to Japan during the American occupation after the war. SPC became a key part of Six Sigma, the Toyota Production System (TPS), and by extension, lean manufacturing.

SPC measures the outputs of processes, looking for small but statistically significant changes, so that corrections can be made before defects occur. SPC was first used within manufacturing, where it can greatly reduce waste due to rework and scrap. It can be used for any process that has a measurable output, and SPC is now widely used in service industries and healthcare.

By: Bill Snyder

In 1500, China’s economy was the strongest in the world. But by the 19th century, the United States, Western Europe, and Japan had leapfrogged over China by churning out goods and services in vast quantities while the former superpower stalled.

Why? Some economists argue that China’s lack of free markets, combined with unencumbered innovation in the West, led to the shift. But what is the relationship between innovation and markets, productivity, and inequality?

Answers to that puzzle and others were explored during a recent forum on the relationship of innovation to economic growth at the Hoover Institution. Three Stanford professors, all Hoover fellows—Stephen Haber, Edward Lazear, and Amit Seru—spoke on a panel moderated by Jonathan Levin, dean of Stanford Graduate School of Business.

By: Christopher Shoe

According to a recent LNS Research survey, 37 percent of quality leaders cite an inability to measure quality metrics as their No. 1 barrier to achieving quality goals. Even worse, the survey showed four in five companies have poor visibility into real-time metrics.

These figures highlight a central problem in quality management: In an era of increasingly large data sets, how can manufacturers leverage these data for meaningful improvement?

It’s a question that’s especially relevant for manufacturers engaged in layered process audits (LPAs), a high-frequency verification strategy where teams conduct short audits every shift. With hundreds or even thousands of audits taking place during the course of a year, making sense of a large volume of data is a core challenge of LPA programs.

While plant managers often track metrics at a granular level, executives need to look at the data a little differently. Let’s look at four of the most important enterprise-level LPA metrics to track.

By: Yen Duong, Knowable Magazine

If you think it’s hard to tell how you’re doing at your job, imagine being a hockey goalie. Let’s say you block every shot in a game. Was that performance due to your superior skills? Or maybe just to a lack of skill in your opponents?

Evaluating ice hockey players' performance is getting easier, for goalies and their teammates. Advances in data collection—including video that can be slowed down and analyzed—and the application of more sophisticated statistics are allowing analysts to better assess how all players contribute to team performance on the ice. Among the more exciting outcomes are data-rich maps of the rink that can reveal especially successful shots or strategic passes.

“Back in the day, like decades ago, we could only really credit players for goals, and maybe assists and stuff like that,” says Namita Nandakumar, co-author of a recent review of trends in hockey analytics in the Annual Review of Statistics and Its Application. “Now research shows that there are other aspects of the game that you can be consistently better or worse at.”

By: Jody Muelaner

In a general sense, capability is the ability to do something. Within manufacturing, capability has a much more specific definition: It expresses the accuracy of a process or piece of equipment in proportion to the required accuracy.

This can be applied to production processes, in which case any random variation and bias in the process must be significantly smaller than the product tolerance. It can also be applied to measurements, where any uncertainties in the measurement must be significantly smaller than the product tolerance or process variation that is being measured.
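The capability idea described above is commonly expressed through the indices Cp and Cpk. As a minimal sketch (the readings and the 9.0–11.0 tolerance below are hypothetical, for illustration only):

```python
import statistics

def capability_indices(data, lsl, usl):
    """Cp compares the tolerance width to the process spread (6 sigma);
    Cpk also penalizes a process mean that drifts off center."""
    mean = statistics.fmean(data)
    sigma = statistics.stdev(data)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical thickness readings against a 9.0-11.0 mm tolerance
readings = [9.9, 10.0, 10.1, 10.0, 9.95, 10.05, 10.0, 9.9, 10.1, 10.0]
cp, cpk = capability_indices(readings, lsl=9.0, usl=11.0)
```

When the process is perfectly centered, Cp and Cpk coincide; a rule of thumb in many industries is to require Cpk of at least 1.33.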

By: Rohit Mathur

Whatever the process or type of data collected, all data display variation. This is also true in software development. Any measure or parameter of interest to our business will vary from time period to time period, e.g., the number of incidents per week or month, the time taken to resolve incidents, the number of tickets encountered in a production support environment per month, and the defect density in code.

Understanding variation is about being able to describe the behavior of processes or systems over time. This variation can be stable, predictable, and routine, or unstable, unpredictable, and exceptional. Being able to distinguish between stable or common-cause variation, and unstable or special-cause variation, helps us to decide the type of action needed to improve the process. The control chart, developed by Walter Shewhart, is the tool that enables us to do so.
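One concrete form of Shewhart’s chart is the individuals (XmR) chart, which derives its limits from the data’s own moving ranges. A minimal sketch, using hypothetical weekly incident counts:

```python
import statistics

def xmr_limits(values):
    """Natural process limits for an individuals (XmR) chart.
    2.66 is the standard Shewhart scaling constant for individuals data."""
    center = statistics.fmean(values)
    mr_bar = statistics.fmean([abs(b - a) for a, b in zip(values, values[1:])])
    return center - 2.66 * mr_bar, center + 2.66 * mr_bar

def special_causes(values):
    """Indices of points outside the natural process limits."""
    lcl, ucl = xmr_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

# Hypothetical weekly incident counts; week 8 looks exceptional
incidents = [12, 14, 11, 13, 12, 15, 13, 30, 12, 14]
flagged = special_causes(incidents)
```

Points inside the limits reflect common-cause variation and call for working on the system; a flagged point signals a special cause worth investigating on its own.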

By: Romesh Saigal, Abdullah AlShelahi

Soon after the Great Recession, the U.S. stock markets plunged—and rebounded within 36 minutes. The Dow Jones Industrial Average dropped more than 9 percent, losing more than 1,000 points before suddenly recovering.

This May 6, 2010, event was the first recorded “flash crash.” Although it didn’t have long-term effects, it raised concerns among investors about the stability of the stock market.

By: Scott A. Hindle

In everyday language, “in control” and “under control” are synonymous with “in specification.” Requirements have been met. Things are OK. No trouble.

“Out of control,” on the other hand, is synonymous with “out of specification.” Requirements have not been met. Things are not OK. Trouble.

Using this language, an obvious axiom would be: Take action when the process is out of control.

The everyday use of in and out of control is, however, unfortunate for control charts, the major tool of statistical process control (SPC). Why? Because in SPC these terms speak of processes as being stable or unstable. To characterize a process as stable or unstable, process limits, from process data, are needed. Specification limits are not needed.

Given the easy-to-understand basis for the action of meeting or not meeting requirements, coupled with the risk of confusion over the terms in control and out of control, why use control charts? If you are curious to see some of the benefits of doing so, read on. Two case studies are used.

Case one: Part thickness

During a regular review meeting in Plant 17, in- and out-of-specification data on the thickness of part 64 were reviewed.
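The distinction between the two views can be sketched in a few lines. The thickness readings and specification limits below are hypothetical, not Plant 17’s actual data; the natural limits use the standard individuals-chart constant 2.66:

```python
import statistics

# Hypothetical thickness readings; the 9.0-11.0 spec limits are illustrative
thickness = [10.0, 10.1, 9.9, 10.0, 10.05, 9.95, 10.0, 10.7, 10.0, 9.9]
LSL, USL = 9.0, 11.0

# Specification view: are requirements met?
in_spec = all(LSL <= t <= USL for t in thickness)

# Process view: are all points within natural limits computed from the data?
center = statistics.fmean(thickness)
mr_bar = statistics.fmean([abs(b - a) for a, b in zip(thickness, thickness[1:])])
lcl, ucl = center - 2.66 * mr_bar, center + 2.66 * mr_bar
stable = all(lcl <= t <= ucl for t in thickness)
```

Here every reading is comfortably in specification, yet the chart flags the process as unstable: the eighth point falls outside limits computed purely from the data, with no reference to the specification.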

By: Chad Kymal, Gregory F. Gruska

During the early 1980s, GM, Ford, and Chrysler established the Automotive Industry Action Group (AIAG), a not-for-profit organization with the mission “To improve its members’ competitiveness through a cooperative effort of North American vehicle manufacturers and their suppliers.” In the late 1980s, U.S. automotive suppliers, through the auspices of the American Society for Quality (ASQ), approached the VPs of purchasing for GM, Ford, and Chrysler and explained the burden of multiple standards being imposed on the supply base. Not only were there multiple OEM standards, there were hundreds of tier-one standards as well.

By: Jay Arthur—The KnowWare Man

When I first learned quality improvement back in 1989 at Florida Power and Light, the consultants who trained us taught a very specific way to draw a Pareto chart. They’d been trained in Japan, the place where quality improvement first took root during the 1950s, so I took it for granted that the way they drew Pareto charts was the authentic and best way to do so.

A Pareto chart combines a bar graph with a cumulative line graph. Using the way we were taught to draw a Pareto chart (figure 1), the bars are touching, making it extremely easy to visually compare levels from one bar to the next. The bars span the entire available space along the x axis. The cumulative line graph springs from the bottom left corner of the first big bar, and each subsequent point is plotted from the corresponding top right corner of its bar.
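The arithmetic behind the cumulative line can be sketched as follows, using hypothetical defect counts:

```python
from collections import Counter

# Hypothetical defect tallies, for illustration only
defects = Counter(scratch=48, dent=27, misalignment=12, discoloration=8, other=5)

# Bars: categories sorted from largest to smallest count
ordered = defects.most_common()
total = sum(defects.values())

# Cumulative line: running share of all defects, in percent
cumulative, running = [], 0
for name, count in ordered:
    running += count
    cumulative.append((name, 100 * running / total))
```

Plotting the sorted counts as touching bars and overlaying the cumulative percentages, with the line springing from the first bar as described above, reproduces the classic Pareto layout.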
