



© 2023 Quality Digest. Copyright on content held by Quality Digest or by individual authors. Contact Quality Digest for reprint information.
“Quality Digest” is a trademark owned by Quality Circle Institute, Inc.
Published: 12/09/2013
Thanks to Twitter, again, I recently came across one of the most insightful articles I’ve read in a long time—on a safety blog, no less. Steven Shorrock takes on the concept of “human error” and adds considerably more perspective to this oft-used term.
“In the aftermath of the [Spain rail] accident, initial investigations ruled out mechanical or technical failure, sabotage and terrorism,” Shorrock writes. “That appeared to leave only two possible explanations—‘human error’ or ‘recklessness,’ or both. When society demands someone to blame, the difference—whatever it might be—can seem trivial. What followed was a display of our instinct to find a simple explanation and someone to blame.”
When something bad happens—an accident like this, an inappropriate comment, a disastrous rollout of a website—we seek action, fast. We want accountability, and usually by that we mean rolling heads. Rarely do we probe further into the context of the event, and especially into the system. As just one example:
“Several claims appeared about the driver in the media, often without relevant context,” Shorrock says. “It was reported that the driver ‘admitted speeding’ on the occasion of the crash. However, there appears to be no evidence that the ‘speeding’ involved conscious disregard for, or indifference to, the dangers of the situation or for the consequences of his actions. This would have been an extreme act. Rather, it seems that the driver was unaware of the context.”
Shorrock goes on to bemoan the effects of overusing the term “human error.”
“Indeed, the popularization of the term ‘human error’ has provided perhaps the biggest spur to the development of human factors in safety-related industries—with a downside,” he says. “When something goes wrong, complexity is reduced to this simple, pernicious, term. ‘Human error’ has become a shape-shifting persona that can morph into an explanation of almost any unwanted event. It is now almost guaranteed to be found in news stories pertaining to major accidents.”
And then Shorrock eventually comes up with the reasons why he is abandoning the term:
• “‘Human error’ is often a post-hoc social judgement. ‘Human error’ is one of few things that often cannot be defined unambiguously in advance of it happening.
• ‘Human error’ requires a standard. To know that something is an error, it must be possible to describe a non-error. This can be surprisingly difficult, partly because there are so many ‘it depends.’ In the context of complex interacting systems such as ATC, there are many ways to get an acceptable result.
• ‘Human error’ points to individuals in a complex system. In complex systems, system behavior is driven fundamentally by the goals of the system and the system structure. People provide the flexibility to make it work.
• ‘Human error’ stigmatizes actions that could have been heroic in slightly different circumstances. What are described as heroic actions could often have been described as tragic errors if the circumstances were only slightly different. The consequences of heroic actions are not known in advance.
• Underlying processes of ‘human error’ are often vital for task performance. In the context of error, we often refer to psychological activity involved in perception, memory, decision making, or action. Taking one example, without expectation, radio-telephony would be very inefficient. Occasionally, one may hear what one expects instead of what is said, but this must be set against improved efficiency during thousands of other occasions.
• ‘Human error’ is an inevitable byproduct of the pursuit of successful performance in a variable world. The context and conditions of performance are often vague, shifting, and suboptimal. The ability to adapt and compensate comes at a cost.”
There’s a lot to chew on there: the need for standards, the importance of context, the fine line between error and heroism, and perhaps even the positive aspects of error when pushing boundaries.
So what does Shorrock now use instead of “human error”?
“Left with a ‘human error’-shaped hole in my vocabulary several years ago, I found an alternative concept thanks to Erik Hollnagel: performance variability. This is not simply a replacement term or a euphemism, but a new way of thinking that acknowledges how systems really work. Performance variability, both at an individual level and at a system or organizational level, is both normal and necessary, and it is mostly deliberate.”
Something to think about next time an incident happens and you feel that instinctual urge to blame instead of understand.
First published Dec. 7, 2013, on Evolving Excellence.
Links:
[1] http://www.safetydifferently.com/the-use-and-abuse-of-human-error/
[2] http://www.evolvingexcellence.com/blog/2013/12/about-that-human-error.html