About That ‘Human Error’
When social judgement meets performance variability
Kevin Meyer
Published: Monday, December 9, 2013 - 18:00

Thanks to Twitter, again, I recently came across one of the most insightful articles I’ve read in a long time—on a safety blog, no less. Steven Shorrock takes on the concept of “human error” and adds considerably more perspective to this oft-used term.

“In the aftermath of the [Spain rail] accident, initial investigations ruled out mechanical or technical failure, sabotage and terrorism,” Shorrock writes. “That appeared to leave only two possible explanations—‘human error’ or ‘recklessness,’ or both. When society demands someone to blame, the difference—whatever it might be—can seem trivial. What followed was a display of our instinct to find a simple explanation and someone to blame.”

When something bad happens—an accident like this, an inappropriate comment, a disastrous rollout of a website—we seek action, fast. We want accountability, and usually by that we mean rolling heads. Rarely do we probe further into the context of the event, and especially into the system. As just one example:

“Several claims appeared about the driver in the media, often without relevant context,” Shorrock says. “It was reported that the driver ‘admitted speeding’ on the occasion of the crash. However, there appears to be no evidence that the ‘speeding’ involved conscious disregard for, or indifference to, the dangers of the situation or for the consequences of his actions. This would have been an extreme act. Rather, it seems that the driver was unaware of the context.”

Shorrock goes on to bemoan the effect of overuse of the term “human error.”

“Indeed, the popularization of the term ‘human error’ has provided perhaps the biggest spur to the development of human factors in safety-related industries—with a downside,” he says. “When something goes wrong, complexity is reduced to this simple, pernicious, term. ‘Human error’ has become a shape-shifting persona that can morph into an explanation of almost any unwanted event. It is now almost guaranteed to be found in news stories pertaining to major accidents.”

Shorrock eventually arrives at his reasons for abandoning the term: the need for standards, the importance of context, the fine line between error and heroism, and perhaps even the positive aspects of error when pushing boundaries. In his words:
• “‘Human error’ is often a post-hoc social judgement. ‘Human error’ is one of the few things that often cannot be defined unambiguously in advance of it happening.
• ‘Human error’ requires a standard. To know that something is an error, it must be possible to describe a non-error. This can be surprisingly difficult, partly because there are so many ‘it depends.’ In the context of complex interacting systems such as air traffic control (ATC), there are many ways to get an acceptable result.
• ‘Human error’ points to individuals in a complex system. In complex systems, system behavior is driven fundamentally by the goals of the system and the system structure. People provide the flexibility to make it work.
• ‘Human error’ stigmatizes actions that could have been heroic in slightly different circumstances. What are described as heroic actions could often have been described as tragic errors if the circumstances were only slightly different. The consequences of heroic actions are not known in advance.
• Underlying processes of ‘human error’ are often vital for task performance. In the context of error, we often refer to psychological activity involved in perception, memory, decision making, or action. Taking one example, without expectation, radio-telephony would be very inefficient. Occasionally, one may hear what one expects instead of what is said, but this must be set against improved efficiency during thousands of other occasions.
• ‘Human error’ is an inevitable byproduct of the pursuit of successful performance in a variable world. The context and conditions of performance are often vague, shifting, and suboptimal. The ability to adapt and compensate comes at a cost.”
So what does Shorrock now use instead of “human error”?

“Left with a ‘human error’-shaped hole in my vocabulary several years ago, I found an alternative concept thanks to Erik Hollnagel: performance variability. This is not simply a replacement term or a euphemism, but a new way of thinking that acknowledges how systems really work. Performance variability, both at an individual level and at a system or organizational level, is both normal and necessary, and it is mostly deliberate.”

Something to think about next time an incident happens and you feel that instinctual urge to blame instead of understand.

First published Dec. 7, 2013, on Evolving Excellence.
About The Author
Kevin Meyer has more than 25 years of executive leadership experience, primarily in the medical device industry, and has been active in lean manufacturing for more than 20 years, serving as director and manager in operations and advanced engineering, and as CEO of a medical device manufacturing company. He consults and speaks at lean events; operates the online knowledgebase, Lean CEO, and the lean training portal, Lean Presentations; and is a partner in GembaAcademy.com, which provides lean training to more than 5,000 companies. Meyer is co-author of Evolving Excellence–Thoughts on Lean Enterprise Leadership (iUniverse Inc., 2007) and writes weekly on a blog of the same name.
Comments
An old story
It's an old story that blaming must come before understanding, to the point that non-teleological approaches are mostly unknown. And if we look at many, many FMEA records, we find that "human error" takes the lion's share of failure causes. It remains to be asked, however, whose "human" the error is: the operator, or the one above him who selected him, trained him, and controls him?
Open Systems
Hello Kevin:
Terrific article. I hope readers see that it is possible to inappropriately apply the concept of blaming human error to any "error" in a system, safety or otherwise.
Please will you explain what the following means? "Taking one example, without expectation, radio-telephony would be very inefficient."
Since all systems are inevitably open, variability always exists. We can create momentarily closed systems, but I am not aware of a system that is always or permanently closed. Variation is inevitable.
Finally, I think I heard the following quote from Russell Ackoff: "We see what we thought before we looked." This is another way of saying that we hear what we expect to hear.
Thank you, Dirk
Myron Tribus
It was Myron Tribus who said we see what we thought before we looked, not Russ Ackoff.