Departments: SPC for the Real World

  
   
Davis Balestracci

Ten Fundamentals of Variation

These work for all personality types.

 

 

Just about everyone has taken a Myers-Briggs inventory somewhere along the line. I'll confess my diagnosis: I'm INFP (introverted-intuitive-feeling-perceiving), which is no doubt frustrating for the I/ESTJ (introverted/extroverted-sensing-thinking-judging) readers of this column. If you're intrigued as to what this means, check out the Web site www.keirsey.com.

I know it seems at times that I'm "all over the map," so I thought it was time to summarize my philosophy for using statistics in quality improvement. Hopefully you'll see the method behind the madness.

So, here are my "Ten Fundamentals of Variation":

1. Good data collection requires planning, which is as important as the data themselves. The first question must be, "What's the objective?" Quality improvement statistics isn't "massaging" reams of data. The goal is simple, efficient collection. In fact, you should know your proposed analysis before one piece of data is collected.

2. Good data analysis requires knowing how the data were or will be collected.

The analysis must be appropriate for the method of collection.

• Raw data say little.
• Graphical methods are the first methods of choice.
• You must always ask, "How were these data collected?"

When you do an analysis, you're making an implicit assumption that the data were collected in a way that makes your proposed analysis appropriate. The computer will do whatever you ask, whether it makes sense or not. Bar graphs, stacked bar graphs and trend lines are virtually worthless.

3. All data result from a measurement process.

Is the measurement process agreed upon and reliable?

It's said that Walter Shewhart actually thought his work on operational definitions was more important than control charts. No definition will ever be perfect, but it must be agreed upon or human variation will contaminate your data. Crude measures of the right things are better than precise measures of the wrong things. As long as an operational definition is "consistently inconsistent," you can work with the data it produces.

4. Variation exists in all things, but may be hidden by:

• Excessive rounding
• Excessive aggregation
• Rolling averages

There's variation in everything. It's amazing how people (especially financial personnel) "invent" techniques to "make it go away." It doesn't. These techniques sometimes hide special causes… or make them appear when there aren't any.
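To make that concrete, here's a minimal Python sketch of how a rolling average smears a genuine spike across several points until nothing looks alarming. The numbers are made up purely for illustration:

```python
# A stable series with one genuine spike (72) at month 9 -- hypothetical data.
monthly = [50, 52, 49, 51, 50, 48, 51, 50, 72, 49, 51, 50, 52, 49, 50]

window = 4  # four-month rolling average
rolling = [sum(monthly[i - window + 1 : i + 1]) / window
           for i in range(window - 1, len(monthly))]

print("raw    :", monthly)
print("rolling:", [round(x, 2) for x in rolling])
# The spike of 72 is obvious in the raw series, but the rolling series
# dilutes it into four consecutive values of about 55 -- none of which
# looks unusual -- and the "elevated" stretch now covers months where
# nothing actually happened.
```

Excessive rounding and aggregation do the same kind of damage for the same reason: they throw away the point-to-point variation you need to see.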

5. All data occur as the output of some process and contain variation.

This variation is caused, and its sources can be better identified through proper data collection.

6. There are sources of variation due to the inputs to a process (e.g., people, methods, machines, materials, measurement and environment), as well as variation over time in these individual inputs and in their aggregate.

Both of these are reflected in the output characteristic being measured.

7. The stability of the process (over time) producing the data is of great importance for prediction or taking future action.

Everything is a process. You must first assess the process producing any set of data (over time) to separate common causes from special causes, and then proceed with appropriate action (e.g., stratification by process input). Any good analysis generates "the next question" toward further understanding the variation you're facing, helps you eliminate inappropriate and unintended variation, and ultimately leads to a more predictable process. There are no "sample size" issues in quality improvement; the need is for more frequent samples over time to increase one's belief in the predictions being made.
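For readers who want the arithmetic behind "separating common causes from special causes," here is a deliberately minimal sketch of the individuals (XmR) chart calculation, one standard way to do it. The weekly counts are hypothetical; the constant 2.66 is the standard XmR factor (3/1.128) that converts the average moving range into three-sigma natural process limits:

```python
# Individuals (XmR) chart limits from a time-ordered series.
# weekly_counts is hypothetical data, purely for illustration.
weekly_counts = [23, 19, 25, 22, 18, 24, 21, 20, 26, 42, 22, 19]

center = sum(weekly_counts) / len(weekly_counts)

# Moving ranges: absolute differences between consecutive points.
moving_ranges = [abs(b - a) for a, b in zip(weekly_counts, weekly_counts[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

# 2.66 = 3 / d2 (d2 = 1.128 for subgroups of size 2).
upper = center + 2.66 * mr_bar
lower = center - 2.66 * mr_bar

print(f"center line           : {center:.1f}")
print(f"natural process limits: {lower:.1f} to {upper:.1f}")

# Points outside the limits signal special causes worth investigating;
# points inside reflect common-cause variation and shouldn't be
# explained one at a time.
for week, x in enumerate(weekly_counts, start=1):
    if x > upper or x < lower:
        print(f"week {week}: {x} -- possible special cause")
```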

8. All data occur in time.

Neglecting the time element may lead to invalid statistical conclusions. You'll never go wrong by doing an initial time plot of any set of data to assess the process that produced it.
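A time plot takes only a few lines. Here's a minimal sketch in Python using matplotlib (an assumed choice; any plotting tool will do), with yield figures invented for illustration:

```python
import statistics
import matplotlib.pyplot as plt

# Hypothetical daily yields, in time order.
daily_yield = [94.1, 95.3, 93.8, 94.7, 95.0, 92.9, 94.4,
               95.6, 93.2, 94.9, 95.1, 94.0, 93.5, 94.8]

median = statistics.median(daily_yield)

# Plot the data in time order with the median as a reference line.
plt.plot(range(1, len(daily_yield) + 1), daily_yield, marker="o")
plt.axhline(median, linestyle="--", label=f"median = {median:.1f}")
plt.xlabel("Observation (time order)")
plt.ylabel("Daily yield (%)")
plt.title("Initial time plot: look for trends, shifts and cycles")
plt.legend()
plt.show()
```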

9. Any source of variation can be classified as a common cause or a special cause.

It's important to distinguish one from the other to take appropriate action. The presence of special causes of variation may invalidate the use of certain statistical techniques.

10. Variation (i.e., uncertainty) occurs even when an object is measured only once.

As a wise person once said, "A person with one clock knows what time it is. A person with two clocks isn't so sure."

 

There you have it: a summary of the "method to Balestracci's madness" in these columns. To my cherished SJ-temperament colleagues, trust me: It's good to get pushed out of your comfort zones. In quality, "logic" isn't always persuasive (and only logicians use it as a source of income).

About the author
Davis Balestracci is a member of the American Society for Quality and past chair of its statistics division. He's also a dynamic speaker with unique and entertaining insights into the places where process, statistics and quality meet. Visit his Web site at www.dbharmony.com.