Are You Unknowingly Reacting to the Data Process?

Everyday variation lurks in many guises
Davis Balestracci
Published: Monday, March 16, 2015 - 13:17

In my last column, I discussed how even a well-designed study with a statistically significant result doesn’t necessarily mean viability in the real world. Post-study, one must study how variation manifests in the result in any environment where the result is applied, and each environment will have its own unique variation.
This type of study requires analytic statistical methods, which are designed specifically for this type of variation: exposing it and monitoring attempts to reduce whatever is inappropriate and unintended. Not only will variation come into play in the process of applying any result, but it will also affect the four statistical data processes:
1. Measurement definition
2. Data collection
3. Analysis
4. Interpretation of the analysis

In formal research such as clinical trials, these four processes are tightly controlled (and subsequently ignored) via well-known enumerative statistical methods. Post-study, however, not only is control over the method of applying the result significantly lessened, but environmental variation (due to work culture) will also come into play in each of the four data processes.

In my last column I used whitewater rapids as an analogy for a real-world environment. In applying a research result, or even the simplest rapid-cycle plan-do-study-act (PDSA) design, do people consider studying the variation in each of the four data processes in addition to the variation of the actual application process? If not, there is a very real danger: People could act on variation due solely to any or all of the four data processes. When these processes are ad hoc rather than formally designed, variation from the data processes can overshadow and cloud the study of any variation (beneficial or otherwise) caused by the actual process being tested.

Cursory consideration of the environmental (i.e., cultural) effects of variation on the four data processes inevitably results in vague plans, leading to vague data and vague results. Wouldn’t it be easier to interpret the study if the data processes had their variation minimized as part of the plan, so as not to cloud the interpretation of the tested process application? This is a most nontrivial effort, which I will address in the rest of this column and the next.
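To make that danger concrete, here is a purely illustrative Python sketch; the delivery-delay scenario, thresholds, and all numbers are invented for this example, not taken from the column. Two collectors tally “late deliveries” from the same data, but without a shared operational definition they report different results. Any apparent difference here is variation from the measurement-definition process, not from the underlying process.

```python
import random

def simulate_delays(n, seed=1):
    """Simulate delivery delays (in days) for n orders; illustrative data only."""
    rng = random.Random(seed)
    return [rng.choice([0, 1, 2, 3, 4, 5]) for _ in range(n)]

delays = simulate_delays(100)

# Collector A decides "late" means more than 2 days;
# Collector B decides "late" means more than 3 days.
late_per_collector_a = sum(1 for d in delays if d > 2)
late_per_collector_b = sum(1 for d in delays if d > 3)

# Same process, same data, two different "counts of late deliveries."
```

A written operational definition, agreed on before collection begins, is what removes this source of variation.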
Human variation rears its ugly head yet again

Each of the four data processes has the same six sources of process inputs we’ve all been taught: people, methods, machines, materials, measurements (literal numbers), and environments. Each also has an inherent “quality” associated with it, and its inputs are subject to the same outside variation as the process being tested. All of this can potentially compromise the quality of the study. The biggest danger? Human variation, which includes cultural perception of the planned study and how that perception affects executing the test and the “methods” of measurement, collection, analysis, and interpretation.

In that context, let’s consider some key questions for each of the four data processes.

Measurement. Are the data operationally defined in line with clear objectives? W. Edwards Deming was fond of saying, “There is no true value of anything.”
• What is the concept you are trying to evaluate?
• What data will allow you to attach a value to this concept?
• By what standards or measures will you judge it?
• Can you write down clear descriptions of how to measure the characteristic?
• What are some factors that might cause measurements of the same item or situation to vary?
• In the case of measuring discrete events, is the threshold of occurrence between a “nonevent” (0) and an “event” (1) clear and understood?
• How can you reduce the varied perceptions and ultimate impact of these factors?

Collection. Consider your plan for collecting these data:
• Will the data collectors have to take samples? If so, how often, how many, and where?
• If people must be sampled, specifically who should be chosen? (See my last column about the fallacy of random sampling in analytic statistics, starting at: “For example, we may take a group of patients....”)
• How will these data be recorded?
• Can you design a data sheet (i.e., check sheet) to record the data as simply as possible?
• Another issue to consider in many cases: Do your customers or suppliers (both internal and external) collect the same kind of data? If so, what procedures or instruments do they use? Are your definitions, standards, and procedures comparable to those used by customers and suppliers?

Analysis. Are you aware that your analysis should be known before one piece of data has been collected? When you apply a specific statistical technique, there is an underlying assumption that the data were collected in the specific way that makes the analysis appropriate. The danger here is that the computer will do anything you want, whether the data were collected appropriately or not. Imagine you have the data in hand:
• What could these data tell you?
• What will you do with the data?
• What specific statistical technique(s) will you use?
• Were the data collected in a way that makes this analysis appropriate?
• What will you do after that?
• Would another kind of data be more helpful?

Interpretation. Statistics is not a set of techniques used to “massage” data. Proper use allows more proactive interpretation of the variation on which you must take appropriate action and ultimately make predictions. The danger: Any variation is one of two types (common cause or special cause), and treating one as the other can actually make the situation worse, and certainly no better.

Human perception of variation, and how people execute the “methods” of the four data processes (measurement, collection, analysis, and interpretation), can compromise the quality of data and cloud the effect of the tested change, rendering any subsequent analysis virtually useless.
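The two types of variation here are common cause (routine) and special cause (exceptional), and the standard tool for separating them is a process behavior chart. The following is a minimal sketch of an individuals (XmR) chart, not the column’s own method: the data are invented, and a fuller treatment would screen the moving ranges and apply additional detection rules.

```python
def xmr_limits(data):
    """Center line and natural process limits for an individuals (XmR) chart.

    Uses the average moving range; 2.66 is the standard scaling constant
    (3 / 1.128) for moving ranges of size 2.
    """
    mean = sum(data) / len(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def special_causes(data):
    """Indices of points outside the natural limits (one simple detection rule)."""
    mean, lcl, ucl = xmr_limits(data)
    return [i for i, x in enumerate(data) if x < lcl or x > ucl]

# A stable process with one aberrant point (index 7):
data = [12, 14, 13, 15, 12, 14, 13, 30, 14, 13, 12, 15]
```

A point outside the natural limits signals special-cause variation worth investigating; reacting to individual points inside the limits as if they were signals is tampering, the very mistake that can make the situation worse.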
Want to avoid ‘vague’? Then study the data process as part of the plan

This must be anticipated and minimized as part of any study’s plan, to the point of being tested separately before any formal test of a potentially beneficial change. Eight questions need to be addressed, which I will talk about next time.

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended, as an approach to leadership, in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach, as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.
© 2022 Quality Digest. Copyright on content held by Quality Digest or by individual authors. Contact Quality Digest for reprint information.
“Quality Digest" is a trademark owned by Quality Circle Institute, Inc.