Davis Balestracci

Quality Insider

A More Robust ‘P’ for Any PDSA Cycle

An excellent plan is essential

Published: Tuesday, April 14, 2015 - 17:59

Good data collection requires addressing eight questions. The first four involve designing the collection itself.

Most of these issues were discussed in my last column. To summarize:
1. Why collect the data?
• Is there a clear objective for this collection?

2. What method(s) will be used for the analysis?
• This should be known even before one piece of data is collected.

3. What data will be collected?
• What specific process output(s) do you wish to capture?
• If data are needed from customers, patients, or staff, what is the process for choosing the individuals from whom to obtain the needed data?

4. How will the data be measured?
• How will you evaluate any output to obtain a consistent number, regardless of who measures it?
• In the case of measuring counts, is the threshold between a nonevent (x = 0) and an event (x = 1) clear (e.g., is Pluto a planet or isn’t it)?
• Remember, there is no “true value”; the appropriate definition depends on the specific objective.
• Does it allow you to take the actions you wish to take?
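The four design questions above can be treated as a checklist that must be complete before a single data point is gathered. A hypothetical sketch (the class and field names are purely illustrative, not from the column):

```python
# Hypothetical sketch: the four design questions as a pre-collection
# checklist. Collection shouldn't start until every field is answered.
from dataclasses import dataclass

@dataclass
class CollectionDesign:
    objective: str               # 1. Why collect the data?
    analysis_method: str         # 2. What method(s) will be used for the analysis?
    outputs: list                # 3. What data will be collected?
    operational_definition: str  # 4. How will the data be measured?

    def unanswered(self):
        """Return the design questions still lacking an answer."""
        answers = {
            "Why collect the data?": self.objective,
            "What analysis method will be used?": self.analysis_method,
            "What data will be collected?": self.outputs,
            "How will the data be measured?": self.operational_definition,
        }
        return [q for q, a in answers.items() if not a]

plan = CollectionDesign(
    objective="Reduce appointment no-shows",
    analysis_method="",  # not yet decided -- flagged below
    outputs=["no-show count per day"],
    operational_definition="No-show = patient absent 15 min after slot start",
)
print(plan.unanswered())  # the design is incomplete until this list is empty
```

Note that question 2 is deliberately checked before collection begins, echoing the point that the analysis method should be known before one piece of data is collected.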

Because human variation can compromise the quality of data and render any subsequent analysis virtually useless for project purposes, these questions help to anticipate and minimize its effects.

Four more questions: reducing human variation in collection logistics

Like any process, the collection process itself has the classic six sources of process inputs (people, methods, machines, materials, measurements, and environment) and is therefore vulnerable to outside variation that compromises its quality.

Once you’ve answered the four questions above, you have a great data design. But you can’t ignore the need to formally plan for data consistency and stability. Here especially, the unintended ingenuity of human psychology to sabotage even the best of designs is a formidable force to overcome.

Despite the best design, many problems lurk in the ensuing collection process. It’s only when the human variation in the collection process is also minimized that data can be trusted enough to allow appropriate action on the process being improved.

The remaining four questions:
5. How often will the data be collected?
• Will any collector’s perception of how often you want the data recorded be a barrier to cooperation, risking incomplete data?
• Are you asking for more data than are needed for the current objective?

6. Where will the data be collected?
• Where is the best leverage point in the process to collect the data?
• Where will job flows suffer minimum interruption?

The next two questions require considering even deeper logistics of obtaining the data. For this, understanding the data collectors and their environment is crucial.
7. Who will collect the data?
• What effect will their normal job have on the proposed data collection?
• Is this person unbiased, and does she have easy and immediate access to the relevant facts?
• Will the collector’s perception of how the data are going to be used be a barrier to cooperation, or risk data being incomplete?

8. What training is needed for the data collectors?
• What would minimize the risk of data being incomplete?
• Exactly how will the data be recorded?

Then there are the logistics of actually recording the data

Seriously consider involving some of the data collectors in the design of any data collection forms. Do the forms allow efficient recording of data with minimum interruption to people’s normal jobs?

To allow this, there are further issues for improvement facilitators to keep in mind:
• Reducing opportunities for error
• Designing traceability to collector and environment
• Making the data collection form virtually self-explanatory, professional looking, and simple
• Formally training all the people who will be involved. Have a properly completed data collection form available to use as reference. As part of this training, reduce fear by discussing the importance of complete and unbiased information.
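A form that builds in traceability to collector and environment can be as simple as a few extra columns. A minimal, hypothetical sketch (all field names are illustrative assumptions, not prescribed by the column):

```python
import csv
import io

# Hypothetical sketch: a data collection record as a CSV schema with
# traceability fields (who, when, where) built in, so every value can be
# traced back to its collector and environment.
FIELDS = ["date", "shift", "collector_id", "location", "defect_count", "notes"]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=FIELDS)
writer.writeheader()
writer.writerow({
    "date": "2015-04-14",
    "shift": "day",
    "collector_id": "JS",   # traceability to the collector
    "location": "Line 2",   # traceability to the environment
    "defect_count": 3,
    "notes": "",
})
print(buffer.getvalue())
```

Keeping the schema this small also honors the earlier warning against asking for more data than the current objective needs.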

Then be sure to answer the following questions that the collectors will naturally ask:
• “What is the purpose of the study?”
• “What are the data going to be used for?”
• “Will the results be communicated to us?”

Finally, test the data collection process to expose most remaining sources of inappropriate and unintended variation and attempt to remove them. Pilot the forms and instructions on a small scale.

After the (brief) test, revise the data collection processes if necessary, using input from the collectors:
• Do the processes work as expected?
• Are the collection forms filled out properly?
• Do people have differing perceptions of operational definitions?
• Are the processes as easy to use as originally perceived?

When possible, sit with the people collecting the data and observe them. As new people enter the collection process, have someone who knows what to do watch the first attempts of novice data collectors.

You’re now as ready as you’ll ever be to begin your formal data collection. But again, be forewarned. The human factor is always lurking—via W. Edwards Deming’s funnel experiment rule No. 4: unintentional “random walk” from original intentions. Throughout the collection, audit for missing data, unusual values, and possible bias. Occasionally observe the actual collection to continue improving measurement consistency and stability. And don’t be surprised by anything you observe.
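Part of that audit can be automated. A minimal, hypothetical sketch: it flags missing entries and screens for unusual values with a crude two-standard-deviation rule (illustrative only; a proper control chart is the better tool, and the data here are made up):

```python
from statistics import mean, stdev

# Hypothetical sketch: audit collected daily counts for missing data
# and unusual values. None marks a day the collector skipped.
data = [12, 14, None, 13, 45, 11, 12, 14, 13, 12]

missing = [i for i, x in enumerate(data) if x is None]  # skipped days
values = [x for x in data if x is not None]

m, s = mean(values), stdev(values)
# Crude screen: flag anything more than two standard deviations from
# the mean as a candidate for follow-up with the collector.
unusual = [x for x in values if abs(x - m) > 2 * s]

print("missing at positions:", missing)
print("unusual values:", unusual)
```

Flagged values aren’t automatically wrong; they’re prompts to go back and observe the actual collection, exactly as the column advises.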

Get the respect you deserve. People involved in the actual data collection should have confidence that you and your team know exactly what you’re asking and looking for—and that you’re actually going to do something with the information. People perceive their jobs as already taking up at least 100 percent of their time; they’re doing you a favor, so make it easy for them. Make sure the data collection results ultimately make their work lives easier as well. If you do, your future projects certainly will be easier, too.


About The Author

Davis Balestracci

Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built in” approach as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.