Statistics Article

By: Evan McLaughlin

As the vice president of quality for a $1.5 billion industrial corporation, Hermann Miskelly is responsible for leading its continuous improvement effort. Now in his 10th year of a lean Six Sigma deployment, he has overseen the execution of more than 4,000 major improvement projects and another 6,000 small improvement projects. Here are three key insights he shared about managing continuous improvement projects, with the help of Companion by Minitab®.

Optimize your management system—what could you do with an extra 120 hours to drive improvement?

You don’t just need to execute improvement projects. You need to improve the way that you manage them.

“Across all our operations and business units, these [Excel] spreadsheets required more than 120 man-hours each month to collate, analyze, and prepare for monthly executive reviews of the continuous improvement effort,” says Miskelly. “Overall, our management system was slow, cumbersome, and labor-intensive, and we were looking for a better way to execute and manage our continuous improvement effort.”

By: Minitab LLC

Anticipating challenges is always a daunting task for continuous improvement professionals. Unforeseen inefficiencies in a process or defects in product development can throw timelines and associated costs into disarray. How can you commit to realistic forecasts and timelines when resources are limited, or when gathering real data is too expensive or impractical? Can simulated data be trusted to give accurate predictions? That’s where Monte Carlo simulation comes in.

Simulated data are, in fact, routinely used in exactly these situations, where resources are limited or gathering real data would be too expensive or impractical. Monte Carlo simulation is a mathematical modeling technique that lets you explore the range of possible outcomes and assess risk so you can make data-driven decisions. Historical data are run through a large number of random computerized simulations that project the probable outcomes of future projects under similar circumstances.

The Monte Carlo method uses repeated random sampling to generate simulated data to use with a mathematical model. This model often comes from a statistical analysis, such as a designed experiment or a regression analysis.

Suppose you study a process and use statistics to model it like this:
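One hypothetical possibility (the transfer function, its coefficients, the input distributions, and the spec limit below are all assumptions made purely for illustration, not values from the article) is a regression-style equation relating an output to two inputs. A minimal Monte Carlo sketch in Python then draws random inputs, pushes them through the model, and summarizes the simulated outcomes:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 100_000  # number of random trials

# Hypothetical regression model of a process output (coefficients are illustrative only):
#   y = 10 + 2.5*x1 - 1.8*x2 + noise
x1 = rng.normal(loc=5.0, scale=0.5, size=n)     # assumed distribution of input 1
x2 = rng.normal(loc=3.0, scale=0.3, size=n)     # assumed distribution of input 2
noise = rng.normal(loc=0.0, scale=0.4, size=n)  # residual variation from the model fit
y = 10 + 2.5 * x1 - 1.8 * x2 + noise

# Summarize the simulated outcomes and assess risk against a hypothetical upper spec
upper_spec = 19.0
print(f"mean = {y.mean():.2f}, std dev = {y.std(ddof=1):.2f}")
print(f"estimated P(y > {upper_spec}) = {(y > upper_spec).mean():.3f}")
```

The same pattern scales to any model you can write down; the quality of the projection depends entirely on how well the fitted model and the assumed input distributions reflect the real process.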

By: Larry Silverberg

Some 20 years ago, my colleague Chau Tran and I developed a way to simulate the trajectories of millions of basketballs on the computer.

We went to the coaches and assistant coaches at North Carolina State University, where we are based, and told them we had this uncommon ability to study basketball shots very carefully.

Their first question was simple: “What’s the best free throw?” Should the shooter aim toward the front of the hoop or the back? Does it depend on whether the shooter is short or tall?

Math offers a unique perspective. It shortens the time it takes to see the patterns behind the best shots. For the most part, we discovered things that the players and coaches already knew—but every so often, we came across a new insight.

Simulating millions of shots

From a mathematical viewpoint, basketball is a game of trajectories. These trajectories are unique in that the ball’s motion doesn’t change much when it’s flying through the air, but then rapidly changes over milliseconds when the ball collides with the hoop or the backboard.
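A drastically simplified sketch of the free-flight part of such a simulation (it ignores spin, air resistance, and the rapid collision dynamics at the rim and backboard, and every launch parameter below is an assumed value chosen for illustration) integrates the ball’s center under gravity and reports where it crosses the height of the rim:

```python
import numpy as np

# Hypothetical free-throw parameters (illustrative values, not the authors' data)
g = 9.81                  # gravitational acceleration, m/s^2
release_h = 2.1           # release height of the ball, m
hoop_h = 3.05             # rim height, m
hoop_dist = 4.19          # approx. horizontal distance from free-throw line to rim center, m
speed = 7.3               # launch speed, m/s
angle = np.radians(52.0)  # launch angle above horizontal

vx, vy = speed * np.cos(angle), speed * np.sin(angle)

# Solve release_h + vy*t - 0.5*g*t**2 = hoop_h for the later (descending) crossing time
a, b, c = -0.5 * g, vy, release_h - hoop_h
t = (-b - np.sqrt(b**2 - 4 * a * c)) / (2 * a)

x_at_rim = vx * t
miss = x_at_rim - hoop_dist   # positive: long, negative: short
print(f"ball crosses rim height at x = {x_at_rim:.2f} m (miss distance {miss:+.2f} m)")
```

Wrapping a loop around code like this, with small random perturbations of the launch speed, angle, and release point, is what turns a single trajectory into millions of simulated shots.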

By: NIST

On February 14, 1929, gunmen working for Al Capone disguised themselves as police officers, entered the warehouse of a competing gang, and shot seven of their rivals dead. The St. Valentine’s Day Massacre is famous not only in the annals of gangland history, but also in the history of forensic science. Capone denied involvement, but an early forensic scientist named Calvin Goddard linked bullets from the crime scene to Tommy guns found at the home of one of Capone’s men. Although the case never made it to trial—and Capone’s involvement was never proved in a court of law—media coverage introduced millions of readers to Goddard and his strange-looking microscope.

That microscope had a split screen that allowed Goddard to compare bullets or cartridge cases, the metal cases a gun ejects after firing a bullet, side by side. If markings on the bullets or cases matched, that indicated they were fired from the same gun. Firearms examiners still use that same method today, but it has an important limitation: After visually comparing two bullets or cartridge cases, the examiner can offer an expert opinion as to whether they match, but cannot express the strength of the evidence numerically, the way a DNA expert can when testifying about genetic evidence.

By: Mike Richman

Sustainable performance improvement is simply impossible without a firm handle on the precepts and tools of statistical process control (SPC). It is for this reason that we cover industrial statistics so frequently here at Quality Digest. After all, as the great Scottish physicist and engineer Lord Kelvin once said, “If you cannot measure something, you cannot improve it.”

With this in mind, I welcome the release of Process Capability Analysis: Estimating Quality (CRC Press/Taylor & Francis, 2018), the forthcoming book from Neil Polhemus. Although not necessarily written with beginners in mind, Polhemus’ work is nevertheless accessible to rank-and-file QA/QC professionals with an abiding interest in the inner workings of process improvement. I myself have no more than a journalist’s basic understanding of SPC, yet found this book’s central premise—i.e., to address “the problem of estimating the probability of nonconformities in a process from the ground up”—to be valid and valuable.
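The core calculation the book builds on can be sketched in a few lines. Assuming a normally distributed quality characteristic (the mean, standard deviation, and specification limits below are hypothetical numbers chosen only to illustrate), the expected fraction of nonconforming output follows directly from the fitted model:

```python
from scipy.stats import norm

# Hypothetical process summary and specification limits
mean, sd = 10.02, 0.05
lsl, usl = 9.85, 10.15

# Probability of a nonconforming unit under a normal model
p_below = norm.cdf(lsl, mean, sd)   # below the lower spec limit
p_above = norm.sf(usl, mean, sd)    # above the upper spec limit
p_nc = p_below + p_above

# Conventional capability indexes for the same assumptions
cp = (usl - lsl) / (6 * sd)
cpk = min(usl - mean, mean - lsl) / (3 * sd)

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
print(f"estimated nonconforming fraction = {p_nc:.2e} ({p_nc * 1e6:.0f} PPM)")
```

In practice, the difficult part is not this arithmetic but justifying the distributional assumption and the estimates of the mean and standard deviation that feed it.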

By: Bonnie Stone

Lean, also known as “lean manufacturing” or “lean production,” focuses on maximizing customer value by removing waste and eliminating defects. Lean tools are about understanding the process, looking for waste, preventing mistakes, and documenting what you did. 

Let’s look at five lean tools used in process improvements, what they do, and why they’re important. Companion by Minitab can help you get started leveraging the tools of lean and other continuous improvement methods to thrive in your business. These tools are even more powerful if you can share and collaborate with your team, so try Companion’s online dashboard reporting capabilities as well.

1. Voice of the customer (VOC) summary

By: Scott A. Hindle, Donald J. Wheeler

In theory, a production process is always predictable. In practice, however, predictable operation is an achievement that has to be sustained, which is easier said than done. Predictable operation means that the process is doing the best that it can currently do—that it is operating with maximum consistency. Maintaining this level of process performance over the long haul can be a challenge. Effective ways of meeting this challenge are discussed below.

Some elements of economic operation

As argued in “What Is the Zone of Economic Production?”, to speak of the economic operation of a manufacturing process, all of the following elements are required:
Element 1: Predictable operation
Element 2: On-target operation
Element 3: Process capability achieved (Cp and Cpk ≥ 1.5)

The notions of on-target operation and process capability are inextricably linked to predictable operation—i.e., demonstrable process stability and consistency over time. Without stability and consistency over time, it is impossible to meaningfully talk about either capability or on-target operation.
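As a minimal sketch of how these elements interact (the measurements and specification limits below are made-up values for illustration), one can place the time-ordered individual values on an XmR chart and proceed to the capability indexes of Element 3 only if Element 1 is satisfied:

```python
import numpy as np

# Hypothetical measurements in time order, plus hypothetical specification limits
x = np.array([10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9, 10.3, 10.0,
              10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 10.2, 9.9, 10.0])
lsl, usl = 9.4, 10.6

# XmR chart: natural process limits from the average moving range
mr = np.abs(np.diff(x))
x_bar, mr_bar = x.mean(), mr.mean()
ucl, lcl = x_bar + 2.66 * mr_bar, x_bar - 2.66 * mr_bar  # limits for individual values
mr_ucl = 3.268 * mr_bar                                  # upper limit for moving ranges

predictable = bool(np.all((x >= lcl) & (x <= ucl)) and np.all(mr <= mr_ucl))
print(f"natural process limits: [{lcl:.2f}, {ucl:.2f}], predictable: {predictable}")

if predictable:
    sigma_within = mr_bar / 1.128  # within-process sigma estimated from the moving range
    cp = (usl - lsl) / (6 * sigma_within)
    cpk = min(usl - x_bar, x_bar - lsl) / (3 * sigma_within)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f} (Element 3 asks for both to be at least 1.5)")
else:
    print("process is not predictable; capability indexes are not yet meaningful")
```

The order of the checks matters: without evidence of predictable operation, the capability indexes of Element 3 have no meaningful interpretation.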

By: John Flaig

Story update 9/26/2017: The words "distribution of" were inadvertently left out of the last sentence of the second paragraph.

Some practitioners think that if data from a process have a “bell-shaped” histogram, then the system is experiencing only common cause variation (i.e., random variation). This is incorrect and reflects a fundamental misunderstanding of the relationship between distribution shape and the variation in a system. However, even knowledgeable people sometimes make this mistake.
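A quick simulated illustration of the point (all numbers below are arbitrary): a process whose mean drifts steadily upward can still produce a single, roughly symmetric mound in the histogram, yet the time order of the same data reveals variation beyond routine point-to-point noise:

```python
import numpy as np

rng = np.random.default_rng(seed=7)
n = 200

# Common-cause noise plus a slow drift in the process mean (a special cause)
drift = np.linspace(0.0, 3.0, n)            # mean creeps up by 3 units over the run
x = 10.0 + drift + rng.normal(0.0, 1.0, n)  # time-ordered measurements

# The histogram is a single, roughly symmetric mound and gives no obvious warning
counts, _ = np.histogram(x, bins=12)
print("histogram counts:", counts.tolist())

# The time order tells a different story: the overall standard deviation is inflated
# relative to the short-term estimate obtained from successive differences
sigma_short_term = np.abs(np.diff(x)).mean() / 1.128  # moving-range estimate of sigma
print(f"overall std dev = {x.std(ddof=1):.2f}, short-term estimate = {sigma_short_term:.2f}")
```

The histogram alone cannot reveal the drift; only the time order of the data, as formalized by a process behavior chart, can.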

By: Ville Satopaa

At a 1906 livestock show in Plymouth, England, nearly 800 people participated in a contest to guess the weight of a slaughtered ox. The average of these estimates was 1,197 pounds. This is remarkable because the true weight of the ox turned out to be 1,198 pounds. The average was only one pound away from the truth. How could it be so accurate? Perhaps by chance?
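A toy simulation (only the true weight and the number of guessers come from the story; the spread of the guesses is an assumption) suggests why this accuracy is more than luck: if the guesses are independent and roughly centered on the truth, the individual errors largely cancel, and the error of the average shrinks with roughly the square root of the number of guessers:

```python
import numpy as np

rng = np.random.default_rng(seed=3)

true_weight = 1198   # pounds, from the story
n_guessers = 800
# Assume guesses are independent and scatter widely around the truth (illustrative spread)
guesses = rng.normal(loc=true_weight, scale=75.0, size=n_guessers)

typical_individual_error = np.mean(np.abs(guesses - true_weight))
crowd_error = abs(guesses.mean() - true_weight)

print(f"typical individual error: {typical_individual_error:.1f} lb")
print(f"error of the crowd average: {crowd_error:.1f} lb")
```

The catch is the assumption itself: when the guesses share a common bias, averaging does not remove it.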

By: Barbara A. Cleary

If you get off the highway and take an alternate route when traffic slows to one lane, you are making a prediction. Likewise, if you decide to invite someone to dinner, that too is a prediction. The scientific method? Predictive in nature. Every time you make a decision, you are making a prediction of an outcome, and choosing one over another based on this prediction.

Prediction skills become second nature because of this daily application. These predictions may not be based on data or evidence; instead, they involve a subjective guess about a preferred outcome. In the case of choosing a traffic route or a dinner date, it’s clear that not much data are involved. The decision rests on subjective interpretations, intuitive hunches, and guesses about potential outcomes.

Will data analysis really enhance prediction accuracy? There are no guarantees unless it is accompanied by a certain amount of understanding of data, of variation, and of process performance.
