Watch Out for the Big Data Steamroller

The most seductive waste of all
Davis Balestracci
Published: Monday, January 14, 2019 - 13:03

“People think that if you collect enormous amounts of data you are bound to get the right answer. You are not bound to get the right answer unless you are enormously smart.”
—Bradley Efron

There has been an explosion in new technology for acquiring, storing, and processing data. The “big data” movement (and its resulting subindustry, data mining) is becoming more prevalent and is having major effects on how quality professionals and statisticians—and everyone else—do their jobs. Big data is a collection of data sets too large and complex to be processed using traditional database and data-processing tools. Any change this big will require new thinking.

Rocco Perla, a colleague for whom I have the utmost respect, feels that even though analytics and data-driven decision making now receive unprecedented attention and focus, this attention has also introduced a number of challenges. It is poor practice to rely on whatever data happen to be available or to assume that sophisticated analytics can overcome poor data quality. The fact that data reside in electronic files says nothing about their quality. Observational data are teeming with reproducibility issues, especially when they result from merging many different sources.

Intuitively, one would think that as the amount of information increases, the degree of confusion decreases. But this is true only up to a point: The situation eventually reverses, and more information leads to increasing confusion. Both low and high amounts of information lead to a high degree of confusion! Information overload has real economic implications, including the loss of up to 25 percent of the working day for most knowledge workers.

W. Edwards Deming often made the point that information is not knowledge. The speed of today’s worldwide instant communication does not help anyone to understand the future or the obligations of management. Do we really need constant updating to cope with the rapidly changing future? It could hardly be accomplished by watching every moment of television or reading every newspaper!

How does one filter all this information and make sense of it?

Data maturity = data sanity

Perla introduces a concept he calls “data maturity” to begin to make sense of all the available data, especially given the growing subindustry of project work inherent in many improvement philosophies. It has the following five characteristics:

1. Projects and their supporting data (e.g., reports, dashboards) are viewed as a resource expenditure that adds costs and complexity for needed support, which should be only temporary. No more automatic “data on demand”: Organizations that are not data mature often view a request for data as going into a magical black box that, with limited input, thought, and resources, can produce the desired end product perpetually.

2. Projects don’t go on forever. Any project needs formal closure to be retired, after which its measures are also retired. Any transition from diagnostic and testing data to data collected to hold the gains should be made deliberately and with thought. In many organizations, it is not unusual for data, reports, and projects to reach a point where the requester rarely or never looks at them again. If one is not careful, new measures simply pile onto old measures and are allowed to continue in perpetuity, cluttering the environment.

3. All measures are operationally defined. It is not splitting hairs to formally define adjectives such as “good,” “reliable,” “unsafe,” or “unemployed” so that, regardless of who measures the situation, each person comes up with the same number or decision—e.g., the threshold at which an environment goes from “safe” (x = 0) to “unsafe” (x = 1). Even though people may not agree on the actual definition, they agree to use it, and to keep using it as long as it continues to support the decision inherent in the objective of the collection. The aim is to reduce the “human variation in perception” factor, which will contaminate any data and render them virtually useless.
Management cannot knowingly allow idiosyncratic or capricious interpretation of a measure, especially if there is potential for distortion to meet an arbitrary goal. For example, is Pluto a planet (x = 1) or isn’t it (x = 0)? It depends: What is the objective for asking the question? (Never mind how you feel about it!) In data-mature organizations, operational definitions are viewed as sacred because they are the only way to ensure a shared understanding of the work to be done. (A minimal coded sketch of an operational definition appears after this list.)

4. Improvement measures are clearly and explicitly linked to any changes being tested in the system—and they are collected frequently over a period of time. The purpose of improvement measures is to understand—as quickly as possible—whether the changes made to a system are leading to improvement, and to inform the next test. Many improvement practitioners shy away from annotated run and control charts because direct causal links between changes and outcomes often are not possible.

5. The dominant form of analysis is charts of data collected over time, used to determine, and react appropriately to, common and special causes of variation. Data-mature organizations understand the importance of examining data over time and use the charts developed by Walter Shewhart (mainly the control chart for individuals) to distinguish between special-cause and common-cause variation. These charts are based on what are called analytic statistics; they neither resemble what is taught in most academic courses (enumerative statistics) nor involve formal hypothesis testing. (A sketch of the individuals-chart calculation also appears after this list.)
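To make the idea of an operational definition concrete, here is a minimal sketch in Python. The noise-level measure and its 85 dB cutoff are hypothetical illustrations, not values from the article; the point is that any observer applying the agreed definition to the same reading gets the same x value.

```python
# A minimal sketch of an operational definition, assuming a hypothetical
# workplace-noise measure. The 85.0 dB cutoff is illustrative only; what
# matters is that everyone agrees to use the same published threshold.

UNSAFE_NOISE_DB = 85.0  # the agreed cutoff; change it only by agreement


def classify_environment(noise_db: float) -> int:
    """Return 1 ("unsafe") if the reading meets or exceeds the agreed
    threshold, else 0 ("safe"). No room for individual perception."""
    return 1 if noise_db >= UNSAFE_NOISE_DB else 0


if __name__ == "__main__":
    for reading in (72.4, 85.0, 91.3):
        print(f"{reading} dB -> x = {classify_environment(reading)}")
```

Whether 85 dB is the “right” number matters less than the fact that two observers can no longer disagree about the same reading; the definition stays in force only as long as it serves the objective of the collection.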
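And to ground the fifth characteristic, here is a minimal sketch of the calculation behind the individuals (I) chart: the center line is the mean of the values, and the limits come from the average moving range multiplied by the standard 2.66 constant. The monthly values below are invented for illustration.

```python
# A minimal sketch of individuals-chart (I-chart, or XmR) limits: the mean
# plus and minus 2.66 times the average moving range, where 2.66 is the
# standard constant 3 / 1.128. The data below are invented for illustration.

def i_chart_limits(values):
    """Return (center, lower_limit, upper_limit) for an individuals chart."""
    center = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return center, center - 2.66 * mr_bar, center + 2.66 * mr_bar


if __name__ == "__main__":
    monthly_scores = [52, 55, 49, 58, 54, 51, 57, 53, 60, 50]  # illustrative
    center, lcl, ucl = i_chart_limits(monthly_scores)
    print(f"center = {center:.1f}, limits = [{lcl:.1f}, {ucl:.1f}]")
    for month, x in enumerate(monthly_scores, start=1):
        note = "special cause?" if not lcl <= x <= ucl else "common cause"
        print(f"month {month}: {x} ({note})")
```

Points outside the limits signal special causes worth investigating; everything inside reflects common-cause variation, and reacting to individual common-cause points is tampering.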
In my experience, virtually without exception, these ideas begin to seriously challenge the way leaders and organizations currently think about information, data, and decision making. Environmental pressures remain constant, and technology is not going to slow down. This creates more dependence on the tremendous inertia of the status quo and a toxic blindness to the deceptive power of this different approach’s counterintuitive simplicity. This blindness makes people especially prone to... the most seductive waste of all?

Hand-in-hand with the explosion of big data will be an industry trying to sell you solutions for analyzing it. In fact, a client of mine sent me the figure below:

[Figure: an “eye chart”—not to be confused with an “I-chart” (control chart for individuals)]

In this case, the vendor promises highly visual, easy-to-understand information that will allow insight not only into the level of employee engagement, but also into the level of leadership effectiveness within each department. The vendor claims its analyses are statistically robust and repeatable over many years (and I’m willing to bet that, unfortunately, most of them use this all-too-common—and wrong—analysis). Note how they “delight their customers” by also throwing in red, yellow, and green color-coding.

Another claim is possibly good news, but for an entirely different reason: This one tool is a fraction of the cost of a typical survey. Well, at least you could save money on all those ongoing, silly customer satisfaction surveys—how many of your electronic data files are filled with those?

Data have their place, but the challenge is to maximize their ability to serve us humans, with all our limitations—not the reverse. Perla concludes that the most effective future leaders will leverage this approach to data with a vision, energy, intellect, and moral compass that come from within—not so much from a report or balanced scorecard.

Let’s stop the “data tail” from wagging the process dog, shall we?
About The Author
Davis Balestracci is a past chair of ASQ’s statistics division. He has synthesized W. Edwards Deming’s philosophy as Deming intended—as an approach to leadership—in the second edition of Data Sanity (Medical Group Management Association, 2015), with a foreword by Donald Berwick, M.D. Shipped free or as an ebook, Data Sanity offers a new way of thinking using a common organizational language based in process and understanding variation (data sanity), applied to everyday data and management. It also integrates Balestracci’s 20 years of studying organizational psychology into an “improvement as built-in” approach, as opposed to most current “quality as bolt-on” programs. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet.