Paul Naysmith

Quality Insider

It’s Easy to Poke Holes in Something

Especially when it already has holes

Published: Monday, October 29, 2012 - 12:29

Cheese is by far one of the greatest foods. It is my only ambrosia, wrapping around my taste buds and sending fireworks of pleasure around my brain. In particular, I love the nutty flavor of Switzerland's holiest of cheeses: Emmental. When you meet me, I will happily bore you into a coma when I start talking about cheese.

Today it isn't Emmental or any other real cheese that has stimulated my quality receptors; it's a management theory that you may be familiar with: the Swiss cheese model (aka "Reason's dynamics of accident causation model").

If you've ever taken part in a failure investigation, you may have seen a diagram, or even produced the diagram yourself, that illustrates all the failure points in the system that produced the undesirable outcome. A perfect storm of a problem, the failure points were precisely aligned in layer upon layer of Swiss cheese slices, creating an aperture through which a bullet could pass without resistance, hitting the failure target.

Figure 1: Adapted from the "dynamics of accident causation" model, or Swiss cheese model, from James Reason, 1990.

During the 1990s, James Reason, a professor at the University of Manchester in the United Kingdom, did extensive research on the psychology of human nature and wrote a textbook, which I refer to on a weekly basis. Although he is best known as a safety guru, I firmly believe that Reason's research should be read and applied in the field of general business, and embraced by quality professionals in particular.

Granted, if you spend time with safety professionals, you'll find that many are very familiar with his research, because it is used to express the many different failings behind an accident. However, the Swiss cheese model, as useful as it is for expressing causal factors or failures in a defense mechanism, is not the element of Reason's research that aids me in my daily pursuit of excellence. It is something more fundamental: Chapter 7 of his book, Human Error (Cambridge University Press, 1990), "Latent errors and systems disasters."

I attribute my concept of systems thinking to W. Edwards Deming's teachings. I am a card-carrying member of the Devoted to Deming fan club. At least once a year, I read a book about Deming, or reread one he wrote for the third or fourth time. I can now recite passages of his work with the passion of a religious leader. Deming introduced to the business world the idea that a business is like a delicate ecosystem, where a small change in one area will effect change across the entire business. He called upon business leaders to pull down the walls between departments and let ideas flow openly, creating an environment where everyone works together for the benefit of the company, or the system.

Deming's ideas would attribute the failings in a business not to the employee but to a failure in the system that the employee was working in. He recommended focusing on the system, which is owned by management, rather than on the employee working within it. This was always a core theme in Deming's philosophy. Deming, however, does have his critics when it comes to systems thinking.

In the book False Prophets: The Gurus Who Created Modern Management and Why Their Ideas Are Bad for Business Today (Basic Books, 2003), James Hoopes writes that Deming's ideas are utopian and perhaps naive. I also believe that business ideas, like any concept, can and will become antiquated. One of the weaknesses in Deming's systems thinking, however, is the fleshy, unpredictable element of any business: the human factor. I believe Deming's concepts can be complemented and enhanced with Reason's research into "human error."

No doubt at this point many proficient quality professionals are throwing down the poka-yoke flag on the field of play. I like poka-yoke; I like all of Shingo's work. However, mistake proofing does have limitations: It may not always be possible to mistake-proof everything in this world. We are limited by knowledge, engineering, technology, time, and money. And if you work in a service business, you are dealing with human decision making. Given these limitations, we default to relying solely on the human factor—the fallible, natural human factor—which brings more variation, more complexity, and greater challenges to consistent levels of quality excellence.

When I read an investigation report, nothing grabs me more than seeing the root cause being attributed to human error. I was taught to never accept this and to ask the question: "What caused the human to err?" A good question to ask, but what is the correct answer? This set me on my quest to understand more about human error, and I introduced myself to Reason's work.

If you follow my writing, you may recall that one of my hobbies is to read investigation reports from different industries. I do this primarily as entertainment and to learn. Recently I was made aware of a failure investigation where it was clear that the individual involved knowingly violated the system designed to prevent failure. His company expended a great deal of attention and resources on training, education, competency, technology, procedures, and supervision to prevent a quality problem. In this instance, however, human error was directly to blame. So I asked, what caused the human to err? Granted, I wasn't privy to all the details of the event, but when I reviewed this report, I immediately pulled Reason's book from my library.

In Chapter 7 of Human Error, Reason discusses "unsafe acts." In the failure example above, I'll interpret unsafe acts as "acts that lead to poor quality." In section 10.2.4, Reason writes, "An unsafe act is more than just an error or a violation—it is an error or a violation committed in the presence of a potential hazard." Therefore, if the hazards of poor quality are present when an individual commits a poor-quality act, an undesired outcome is all but inevitable. The slices of Swiss cheese have aligned, and the bullet has passed through without deflection.

In the same section, Reason published a model that describes this graphically and more simply than I can express in writing:

Figure 2: James Reason's summary of the psychological varieties of unsafe acts, classified initially according to whether the act was intended or unintended, and then distinguishing errors from violations—Human Error (Cambridge University Press, 1990).

So what is my takeaway from this investigation and research? We all have this thing called "decision making," or "free will," and it is here, I believe, that the issue must be addressed. If we can help people make better and more informed decisions in the presence of poor-quality hazards (presuming those hazards cannot be eliminated), then we will prevent undesired outcomes. Some call this culture change; others see it as systems thinking. I view it as maturity.

All companies or businesses are on a journey; the learning business will thrive and develop, or as Deming put it, "Institute a vigorous program of education and self-improvement." Deming's desire for education will bring organizational maturity, and therefore change, in the culture.

I'm not trying to make the point that Deming was there before Reason, or that Reason has bettered Deming, nor am I rehashing systems thinking. I am highlighting that applying a scientific approach will enhance your thinking and business philosophy, and the effects will be long-lasting. After all, it took 300 years before Einstein improved on Newton's theories of the physical universe.

Reason has been quoted as saying, "You cannot change the human condition; however, you can change the conditions that humans work in." Why not change your condition, and apply some education or self-improvement today? I highly recommend looking up James Reason's body of work.


About The Author


Paul Naysmith

Paul Naysmith is the author of Business Management Tips From an Improvement Ninja and Business Management Tips From a Quality Punk. He’s also a Fellow and Chartered Quality Professional with the UK’s Chartered Quality Institute (CQI), and an honorary member of the South African Quality Institute (SAQI). Connect with him at www.paulnaysmith.com, or follow him on Twitter at @PNaysmith.

Those who have read Paul’s columns might be wondering why they haven’t heard from him in a while. After his stint working in the United States, he moved back to his homeland of Scotland, where he quickly found a new career in the medical-device industry; became a dad to his first child, Florence; and decided to restore a classic car to its roadworthy glory. With the help of his current employer, he’s also started a first-of-its-kind quality apprenticeship scheme, which he hopes will become a pipeline for future improvement ninjas and quality punks.


Gorgonzola vs. Emmental

Hi, Paul: I applaud your latest column, and I would applaud you more had you mentioned the celebrated Gorgonzola cheese: even John Steinbeck made a myth of it. Though I read your piece all too quickly - I apologize for that, but at 5 a.m. local time ... I guess you understand - I share your point of view: all too often I turn up my nose at problem-solving records claiming that the failure's root cause was human error. By definition, any error is systemic, not systematic: were it systematic, it would be a common, not a special, cause of variation. Thank you.

PS: Swiss cheese-makers ignore the role of wine in appreciating cheese; the French and Italians don't.