Don’t Allow Deviance to Become Normal

It’s how ships get blown up and workers lose fingers and hands

Royal Navy Archives

William A. Levinson

Levinson Productivity Systems

Wed, 04/08/2026 - 12:03

Most quality practitioners and process engineers are familiar with management of change (MOC). The premise is that any significant change to a process factor (such as the familiar cause-and-effect diagram categories of manpower, machine, material, method, measurement, and environment) can have unforeseen and undesirable consequences. This is why MOC reviews must take place before the changes in question are deployed.


However, in the Quality Digest webinar “Management of Change (MOC) for EHS: Preventing Incidents Before They Exist,” presenter Stephanie Ojeda raised the issue of changes that never get reviewed because they happen outside the organization’s formal MOC review process. As my notes taken during the webinar summarize, “Workarounds and shortcuts are a frequent cause of problems, especially when the workaround is effective and becomes normalized—the new way of doing the job.” 

The U.S. Army attributes the phrase normalization of deviance to sociologist Diane Vaughan. “[T]his phenomenon describes the gradual process where individuals, groups, or organizations tolerate unacceptable behaviors, practices, or conditions as normal.”

The principle, “How you do anything is how you do everything,” originating with Zen Buddhism and made famous in the John Wick movie series, is highly instructive. Inattention to seemingly minor details or issues can often carry over into more important ones. Normalization of deviance can result in catastrophic outcomes, as was seen in the sinking of HMS Indefatigable, HMS Queen Mary, and HMS Invincible during the Battle of Jutland in 1916.

As stated by Admiral David Beatty, who would have been blown up himself had a mortally wounded Royal Marine officer not flooded HMS Lion’s magazines, “There seems to be something wrong with our bloody ships today.” What was wrong with Queen Mary, Invincible, and Indefatigable? All were struck by enemy shells, but these apparently didn’t reach the well-protected magazines directly. The shells penetrated turrets to set off the cordite propellant charges inside them, and the flash fires spread quickly to the magazines beneath them.

The danger was known, and countermeasures were in place to stop it; similar features probably saved the USS Iowa in 1989. American Military University Edge explains, “Unfortunately, naval leaders and the commanders of individual vessels allowed their fear of not having enough ammunition during battle to override these mechanical and procedural safeguards, with tragic results.”

Maybe the people involved thought the time required to open and close the magazine hatches to prevent the spread of a flash fire was nonvalue-adding—which it was indeed in the context of reloading the guns. They should have realized, though, that these safety features were there for a purpose. The problem was very similar to the idea of leaving a fire door open so that people, equipment, and work can go through more quickly. It’s obviously a bad idea.

Normalization of deviance also happens in workplaces. Henry Ford’s safety chief Robert A. Shaw introduced presses that required each worker to hold down two buttons, neither of which was anywhere near the moving parts of the press, to activate the machine. This made it physically impossible to close the press while a hand was inside it. If more than one person operated the machine, each worker essentially locked out the others from activating it until he pressed his own buttons. These controls, however, are sometimes bypassed to increase productivity and get the work out more quickly.

Normalization of deviance is how ships get blown up and workers lose fingers and hands:

Controls, Drives, and Automation reports: “One engineer injured his fingers when his hand became trapped after he defeated an interlocked door to get a better look at a wrapping fault.”

From Pinnacle Systems: “A facility received an OSHA citation after disabling an interlock switch to increase production speeds, leading to a serious worker injury.”

IOSH magazine reports in the context of a machinery-related fatality: “Inspectors found the spare keys were used regularly by staff who entered the enclosures while machinery was operating.” (IOSH is published by the U.K.’s Institution of Occupational Safety and Health.)

You only have to get it wrong once

The Heinrich safety pyramid and its successors illustrate the fact that normalization of deviance is like Russian roulette, albeit with a far lower likelihood of a bad outcome. This is, in fact, why deviance gets normalized. No rational person would play a game with a one in six chance of a fatality. But what about one in 100,000 or one in a million? This is exactly what happens when people take shortcuts or ignore warning signs. Nothing bad happens the first few dozen times, so people assume it never will. This might explain why some people persist in driving under the influence of alcohol; they have never caused an accident before, so they assume they never will.
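The Russian-roulette comparison can be made concrete with a little arithmetic. The cumulative chance of at least one bad outcome in n independent tries at per-try probability p is 1 − (1 − p)^n. A short Python sketch (the shortcut probabilities and repetition counts below are illustrative assumptions, not figures from any incident data):

```python
def cumulative_risk(p: float, n: int) -> float:
    """Chance of at least one bad outcome in n independent tries,
    each with per-try probability p: 1 - (1 - p)**n."""
    return 1.0 - (1.0 - p) ** n

# Literal Russian roulette: a 1-in-6 chance taken 6 times is already
# about a 2-in-3 cumulative chance of disaster.
print(f"{cumulative_risk(1/6, 6):.2f}")  # ~0.67

# A "one in a million" shortcut repeated 10,000 times a year for
# 20 years still accumulates to roughly an 18% chance of one disaster.
print(f"{cumulative_risk(1e-6, 10_000 * 20):.2f}")  # ~0.18
```

The point of the sketch is that a risk that feels negligible per occurrence becomes substantial once the shortcut is normalized and repeated indefinitely.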

People even ignore obvious warning signs. Foam shedding from the space shuttle Columbia was a known problem. But because nothing bad ever happened, the shuttle continued to be used until something bad did happen and we lost seven astronauts. Normalization of deviance was also implicated in the loss of the Challenger in 1986. The risks associated with the booster O-rings were cited prior to the disaster, but no corrective action was taken.

Mike Mullane, during the second part of the webinar with Ojeda, added that not fastening seat belts is another example of normalization of deviance. The chance of being involved in a high-impact accident is very low, so most of the time nothing bad happens. But, if it does, an unbelted driver or passenger could be easily killed or seriously injured.

Risks were also identified for the Titan submersible, whose subsequent implosion under enormous water pressure killed its crew and passengers. As reported by Grace Eliza Goodwin and Matthew Loh, “A submarine pilot hired to assess the now-missing Titan submersible warned in 2018 that its hull-monitoring system would only detect failure ‘often milliseconds before an implosion.’”

This also is why any safety-related or quality near-miss must be treated as if the incident actually happened. The fact that it could have happened is a warning that, if the situation is allowed to persist, it eventually will happen.

Suppose, for example, I almost trip over a hazard in a workplace but don’t report it. Maybe I’ll remember to avoid the hazard in the future. But maybe I won’t, or somebody who is unaware of it will suffer harm from it. Suppose a worker almost assembles a part backward but catches the mistake and remembers to pay closer attention in the future. The occurrence root cause—namely, the fact that it’s possible to assemble the part backward—is, however, allowed to remain, which means that sooner or later, one or more nonconforming parts will be produced.

Educate your workforce

Why do people use workarounds and shortcuts, or tolerate conditions that jeopardize quality or safety? There’s always a natural desire to do a job more efficiently and produce more parts per hour—or, in the case of the British crews at Jutland, load and fire their guns more rapidly. The receipt of well-aimed fire from their German counterparts was an even stronger incentive than civilian production quotas to take dangerous and, in the end, costly shortcuts. In the case of the space shuttles, planners often don’t like to scrub missions. But that is exactly what they have to do when they see a questionable situation.

Consider a simple scenario that involves production quotas, or a rush order from a customer that requires more parts per hour for a machining process. An obvious course of action is to increase the tool speed to remove material more rapidly. This temptation should be resisted for numerous reasons. Heat generation can cause excessive tool wear or change the part’s material properties. The reaction rate of a chemical process can often be increased by raising the temperature, but the higher temperatures could enable side reactions that generate undesirable byproducts. Even though the reaction is faster, the overall yield of the desired product may be lower, or contaminants may be introduced.
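The chemical-process point can be illustrated with the Arrhenius equation, k = A·exp(−Ea/RT). If an unwanted side reaction has a higher activation energy than the desired reaction, raising the temperature speeds up the side reaction proportionally more, so selectivity drops even as the batch finishes faster. A sketch with purely illustrative constants (the pre-exponential factors and activation energies below are assumptions for demonstration, not data from any real process):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def rate_constant(A: float, Ea: float, T: float) -> float:
    """Arrhenius equation: k = A * exp(-Ea / (R*T))."""
    return A * math.exp(-Ea / (R * T))

for T in (300.0, 330.0):
    k_main = rate_constant(1e10, Ea=60_000, T=T)  # desired reaction
    k_side = rate_constant(1e10, Ea=80_000, T=T)  # unwanted side reaction
    # Both k values rise with T, but k_side rises faster, so the
    # selectivity ratio k_main/k_side shrinks at the higher temperature.
    print(f"T={T:.0f} K  selectivity k_main/k_side = {k_main / k_side:.0f}")
```

Running the loop shows the selectivity ratio falling as T increases, which is the quantitative version of "faster reaction, lower yield or more contaminants."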

Process instructions and specifications are there for a reason, and a basic principle of standard work is that it’s exactly that: standard work. The job is done the same way every time, so the outcome is always predictable.

Why we must follow standards

Although we think of standard work as a relatively new concept, it’s actually 115 years old. Frederick Winslow Taylor described it clearly in Principles of Scientific Management, first published in 1911. Contrary to popular belief, Taylor didn’t want workers to leave their brains at the factory gate. But he also made it clear that they must not deviate from the established process in the absence of what we would now call an MOC assessment.

Taylor wrote: 
“It is true that with scientific management the workman is not allowed to use whatever implements and methods he sees fit in the daily practice of his work. Every encouragement, however, should be given him to suggest improvements, both in methods and in implements. And whenever a workman proposes an improvement, it should be the policy of the management to make a careful analysis of the new method and, if necessary, conduct a series of experiments to determine accurately the relative merit of the new suggestion and of the old standard. And whenever the new method is found to be markedly superior to the old, it should be adopted as the standard for the whole establishment.”

The key takeaways here are:
• Workers must follow the established procedures, and with the specified materials, equipment, and process settings. No deviations are permitted.
• If somebody thinks there is a better way to do a job, he or she is encouraged to bring it to the attention of the process owner or other responsible party.
• The organization then follows its established MOC process to ensure that the change doesn’t have any undesirable side effects. If the change passes review, the procedure or work instruction is updated to reflect the improvement, and it becomes the new “one best way” or “best known way” to do the job.

The last point is another reason people aren’t allowed to make unsupervised changes. There was a time when, if workers were left to their own devices, some would develop “knacks,” or specialized and arcane ways to do a job more quickly. These generally, although not universally, caused no MOC issues. If, however, the knack never became part of what we now call organizational knowledge, it remained unique to the worker involved. When those workers left, retired, or died, the knowledge was lost until somebody else rediscovered it.

Why we must control all changes

We all know that Frank Gilbreth realized that bricks and mortar should be delivered to masons at waist level so they wouldn’t have to bend over to pick them up. This increased productivity from 125 to 350 bricks per hour, and with less physical effort. A medieval painting, however, shows bricks and mortar being supplied at waist level, which suggests that the painter saw bricklaying being done in exactly this manner. “Construction of the Tower of Babel” in the Maciejowski Bible even shows an assistant carrying a bowl of mortar on his back up a ladder, apparently to keep the mortar level with the rising wall so the bricklayer doesn’t need to bend over to reach it.

Because the method never became standardized, though, millions if not billions of worker-hours were lost to wasted motion over the intervening centuries. Taylor noted very clearly the enormous waste of human labor and, by implication, the consequent poverty of the workers and the exorbitant cost of construction: “Think of the waste of effort that has gone on through all these years, with each bricklayer lowering his body, weighing, say, 150 pounds, down two feet and raising it up again every time a brick (weighing about 5 pounds) is laid in the wall! And this each bricklayer did about one thousand times a day.”

Gilbreth proved it was possible to lay 350 bricks per hour with a nonstooping scaffold, while the rate was 125 an hour without it. The inferior job design wasted more than 64% of the bricklayer’s working life.
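The 64% figure follows directly from the two rates, and Taylor’s lifting numbers can be checked the same way. A quick arithmetic sketch (figures taken from the rates and quote above):

```python
# Gilbreth's rates: 125 bricks/hour stooping vs. 350 bricks/hour with
# the nonstooping scaffold. Fraction of potential output wasted:
old_rate, new_rate = 125, 350
wasted_fraction = 1 - old_rate / new_rate
print(f"{wasted_fraction:.1%}")  # 64.3%

# Taylor's arithmetic: a 150 lb bricklayer lowering and raising his body
# 2 feet, roughly 1,000 times a day, against a 5 lb brick each time.
wasted_work_ft_lb = 150 * 2 * 1000
print(f"{wasted_work_ft_lb:,} foot-pounds of wasted lifting per day")
# 300,000 foot-pounds of wasted lifting per day
```

That is, the stooping method threw away nearly two-thirds of each bricklayer’s potential output, almost all of it spent moving the worker’s own body rather than the bricks.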

Summary

The familiar discipline of management of change exists to review planned changes and ensure that they don’t have unintended and undesirable consequences. Whatever the adequacy of the MOC reviews, such changes are at least planned and approved. Normalization of deviance, on the other hand, involves shortcuts, workarounds, and other forms of noncompliance with established standards and processes. These actions are never supervised, approved, or subjected to MOC review. Everybody in the organization needs to realize that even when normalization of deviance delivers apparent benefits, those benefits are often illusory, short-lived, or both.


© 2026 Quality Digest. Copyright on content held by Quality Digest or by individual authors. Contact Quality Digest for reprint information.
“Quality Digest” is a trademark owned by Quality Circle Institute Inc.
