
William A. Levinson

Lean

Why Variation Matters to Everybody

We should make variation accessible to a wider spectrum of professionals

Published: Monday, July 30, 2018 - 12:03

Quality and manufacturing practitioners are most familiar with the effect of variation on product quality, and this is still the focus of the quality management and Six Sigma bodies of knowledge. Other forms of variation are, however, equally important—in terms of their ability to cause what Prussian general Carl von Clausewitz called friction, a form of muda, or waste—in production control and also many service-related activities. This article shows how variation affects the latter as well as the former applications.

W. Edwards Deming’s Red Bead experiment was an innovative, hands-on exercise that demonstrated conclusively the drawbacks of blaming or rewarding workers for unfavorable and favorable variation, respectively, in a manufacturing process. The exercise consisted of using a sampling paddle to withdraw a certain number of beads from a container. White beads represented good parts, and red beads nonconforming parts. Results for five workers might, for example, be as follows for 200 parts, of which 3 percent are nonconforming. (You can simulate this yourself with Excel by means of the Data Analysis menu and Random Number Generation. Use a binomial distribution with 200 as the number of trials, and nonconforming fraction p = 0.03.)


Figure 1: Red Bead experiment

Suppose management does the obvious by firing workers three and five, promoting worker four, and telling workers one and two to try to do better. The company will go through workers very quickly, including those recently promoted for their “good work,” without improving quality. Participants realized very quickly that the yield, or percentage of white beads, depended not on the worker’s skill or diligence but rather on the fraction of red beads in the container.
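The Excel recipe described above can be mirrored in Python (a sketch for illustration; the original exercise uses physical beads or an Excel binomial simulation):

```python
import random

# Five workers each produce n = 200 parts from a process whose true
# nonconforming fraction is p = 0.03. Any differences in the workers'
# "performance" are pure chance, not skill or diligence.
random.seed(1)  # fixed seed so the run is reproducible

p, n = 0.03, 200
reds = [sum(1 for _ in range(n) if random.random() < p) for _ in range(5)]

for worker, red in enumerate(reds, start=1):
    print(f"Worker {worker}: {red} red beads ({red / n:.1%} nonconforming)")
```

Rerunning with a different seed reshuffles which workers look "good" and "bad," which is exactly the point of the exercise.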

Others and I have meanwhile used simulated gun targets, in which the boundaries of the target represent the product’s specification limits, to illustrate the same principle for continuous-scale (i.e., variables) data. A smoothbore musket represents a noncapable process that will often miss the target regardless of the operator’s skill. The manager could put somebody like Annie Oakley or Simo Häyhä (the Finnish sniper known as The White Death) behind a musket and get completely mediocre results, as shown in figure 2. An Olympic match rifle, on the other hand, can represent a Six Sigma process that, if centered on the bull’s-eye (i.e., nominal), will miss only twice, on average, out of every billion shots.


Figure 2: Effect of variation on quality

These instructional methods relate, however, to manufacturing applications for which industrial statistics and Six Sigma were primarily designed. Many Six Sigma techniques are entirely usable for services, but people who work in service industries are unlikely to relate to simulated defects in manufactured parts, gun targets that illustrate the effect of variation on the ability to meet specification limits, or process performance indices that quantify the latter. We therefore need to make the concept of variation accessible to a wide spectrum of professionals, including those in project management and service industries.

Variation and production control

Eliyahu Goldratt and Jeff Cox’s The Goal (North River Press, 2014 reprint) illustrated the effect of variation, not in product dimensions but rather in processing times, on production. An exercise in which dice simulate each workstation’s output for a given time period does for production control what the Red Bead experiment did for quality. Participants expected production to average roughly 3.5 units per period, the mean of a single die roll, but it was substantially less, while bubbles of inventory accumulated between workstations. The reason is that favorable variation cannot offset unfavorable variation when the factory is operating at full capacity.

Suppose, for example, the die roll is five, but only three parts are waiting for the operation. The unused capacity of two units is lost forever and can’t be used later to offset a low die roll when a lot of parts are waiting.
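Goldratt’s exercise can be sketched in a few lines of Python (the five-station balanced line and the unlimited raw material at station 1 are assumptions for illustration):

```python
import random

# A balanced line of five workstations, each rolling one die (1-6 units
# of capacity) per period. Station 1 always has raw material; each
# downstream station can process no more than the inventory queued in
# front of it. The average roll is 3.5, yet throughput falls short and
# inventory piles up between stations.
random.seed(2)

stations, periods = 5, 1000
queues = [0] * (stations - 1)   # inventory waiting before stations 2..5
finished = 0

for _ in range(periods):
    moved = random.randint(1, 6)            # station 1 output
    for q in range(stations - 1):
        queues[q] += moved                  # arrivals from upstream
        moved = min(random.randint(1, 6), queues[q])
        queues[q] -= moved
    finished += moved

print(f"Average throughput: {finished / periods:.2f} units/period (die average: 3.5)")
print(f"Inventory stranded between stations: {queues}")
```

The throughput shortfall grows with the number of stations in series, because each station’s unused capacity is lost forever.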

Variation and traffic jams

The same underlying root cause—i.e., the fact that favorable variation does not offset unfavorable variation—explains how traffic jams appear out of nowhere during rush hour. “A simple experiment shows that when the density of vehicles on a road passes a certain threshold, traffic jams emerge because of fundamental instabilities inherent in multiparticle interactions,” notes Dennis Normile in “Traffic Jams Happen, Get Used to It.” The “certain threshold” almost certainly relates to the road’s capacity, with any decrease in a driver’s speed propagating backward. Autonomous vehicles and adaptive cruise controls could alleviate the problem by simply reducing the human-induced variation. (In any event, there is an audiobook version of The Goal to which one can listen while experiencing the corresponding effect of variation on the highway rather than in the factory.)

Variation and project management

Goldratt’s Critical Chain (Routledge, 2017) applied the same concept to project management. Suppose a project has two independent critical paths, each with a 50-percent chance of finishing early or on time. There is then only a 25-percent chance that both will do so. There is a 50-percent chance that one will finish early and one late, but because the favorable variation from the one can’t be used to offset the unfavorable variation from the other, the project will finish late.
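Assuming each path independently has a 50-percent chance of finishing on time, a quick Monte Carlo check in Python confirms the 25-percent figure:

```python
import random

# Two independent critical paths, each (by assumption) with a 50 percent
# chance of finishing early or on time. The project finishes on time
# only if BOTH paths do.
random.seed(3)

trials = 100_000
both_on_time = sum(
    1 for _ in range(trials)
    if random.random() < 0.5 and random.random() < 0.5
)
ratio = both_on_time / trials
print(f"P(project on time) = {ratio:.3f}")  # theory: 0.5 x 0.5 = 0.25
```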

The same issue carries over into complex bills of materials (BOMs) in which the absence of any item will make it impossible to do the job. Henry Ford recognized this long ago, when he wrote of the need for protective inventory: “If transportation were perfect and an even flow of materials could be assured, it would not be necessary to carry any stock whatsoever.... With bad transportation one has to carry larger stocks.” If the items arrive on time, on average, there will invariably be shortages of some items, and surpluses of others, unless the variation in delivery times can be made negligible—and Ford made every conceivable effort to do this.

Variation and services

Here is a simulation of how variation might affect a service, such as checking in at a hotel desk or rental car agency. Management is upset because even though the desk is designed to serve exactly three customers every half-hour, and an average of three customers do arrive every half-hour, customers often end up waiting despite the desk’s measured throughput averaging well under three per half-hour. It doesn’t seem to matter whom management assigns to the service desk, either; the results are always substandard. It is quite simple to use Excel to simulate a Poisson arrival process with a mean, and therefore a variance, of three. Here is an example of 30 periods during which an average of 2.93 customers arrived in each period, and only 2.87 were served.


Figure 3: Service with fixed processing rate and random arrivals

The graph reads from back to front, with the 30th and last period in front. Although 2.93 customers arrive during each period on average, there are sometimes as many as five (of which the service can process only three) and as few as zero. If four or more customers arrive during any period, or even two or three when enough are waiting from the prior period, customers will be left waiting. The green bars reflect this backlog and show that as many as three customers may be waiting.

If on the other hand nobody is waiting, and two or fewer arrive, some of the period’s capacity will go unused. This unused capacity is represented by the blue bars on the right. As with Goldratt’s matchsticks and dice production simulation, the unused capacity can’t be moved to a period when it is needed; time lost at the constraint is lost forever. The wasted capacity is why the operation serves fewer customers per period than the number that arrive.
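A minimal Python sketch of this simulation (Poisson variates generated with Knuth’s method; fixed capacity of three per period, as in the article’s scenario) shows the same behavior:

```python
import math
import random

# Poisson arrivals with mean 3 per half-hour period; fixed capacity of
# exactly 3 served per period. Unused capacity in a slow period cannot
# be banked against a later rush, so throughput trails arrivals.
random.seed(4)

def poisson(mean):
    """Draw one Poisson variate by Knuth's method (fine for small means)."""
    limit, k, prod = math.exp(-mean), 0, random.random()
    while prod > limit:
        k += 1
        prod *= random.random()
    return k

periods, capacity = 30, 3
waiting = arrived_total = served_total = 0

for _ in range(periods):
    arrivals = poisson(3)
    served = min(capacity, waiting + arrivals)   # can't serve more than capacity
    waiting += arrivals - served                 # backlog carries forward
    arrived_total += arrivals
    served_total += served

print(f"Avg arrivals/period: {arrived_total / periods:.2f}")
print(f"Avg served/period:   {served_total / periods:.2f}")
print(f"Still waiting at end: {waiting}")
```

Over many runs the average served per period comes out below the average that arrive, exactly as in the figure.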

In another simulation, the average number of arrivals exceeded the Poisson mean, and also the service’s capacity to handle them. The service was then able to deliver its full capacity of three per period, but lines of dissatisfied customers backed up through the door.

This simulation is easy to perform because it assumes there is no variation in the processing rate itself. Matters get a lot worse when the processing time follows the exponential distribution, in which case the number of customers served per unit of time also follows the Poisson distribution. Given:
• Poisson arrival rate of λ customers per unit time
• Average service rate (Poisson mean) of μ customers per unit time
• Lq = number of customers waiting (does not include the one being served)
• Wq = expected time that customers must wait

In Quantitative Analysis for Business Decisions (Irwin Books, 1991), authors Harold Bierman, Charles Bonini, and Warren Hausman give the following equations:

Lq = λ² / (μ(μ − λ))

Wq = λ / (μ(μ − λ))

It is obvious that, when μ = λ, i.e., there is no excess capacity at all, queue lengths and waiting times will become infinite. Suppose our service depicted above can serve four people an hour, and the arrival rate is only three per hour, but both the arrivals and number served follow a random Poisson distribution. Even though we are operating at only 75 percent of capacity, Lq = 3²/(4(4−3)) = 2.25 and Wq = 3/(4(4−3)) = 0.75. This scenario was simulated, and the average arrival rate came to only 2.5 customers per hour while the average simulated capacity was 3.77, which means the utilization was only 66 percent, but there were nonetheless six people waiting during one period.
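Using the standard M/M/1 results Lq = λ²/(μ(μ − λ)) and Wq = λ/(μ(μ − λ)), the numbers above are easy to verify in Python (a sketch; the article works them in Excel):

```python
def mm1_queue(lam, mu):
    """Return (Lq, Wq) for an M/M/1 queue; requires lam < mu for stability."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    lq = lam**2 / (mu * (mu - lam))   # expected number waiting (excludes one in service)
    wq = lam / (mu * (mu - lam))      # expected time spent waiting
    return lq, wq

# The article's example: mu = 4 served/hour, lam = 3 arrivals/hour
lq, wq = mm1_queue(3, 4)
print(f"Lq = {lq:.2f} customers waiting, Wq = {wq:.2f} hours in queue")  # 2.25, 0.75

# The trend behind figure 4: waiting explodes as utilization approaches 100 percent
for lam in (3.0, 3.2, 3.6):
    lq, wq = mm1_queue(lam, 4)
    print(f"utilization {lam / 4:.0%}: Lq = {lq:.2f}, Wq = {wq:.2f} h")
```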

Figure 4 applies the above equations to show the expected number of people waiting, and also how long they wait. Even when the mean Poisson arrival rate is 3.2, for a utilization of only 80 percent, the expected queue length is 3.2, and the expected waiting time is one hour. Matters get much worse at 90-percent capacity, when the waiting time grows to 2.25 hours, and at which point customers are likely to start walking out in disgust (if they have somewhere else they can go).


Figure 4: Length of queue, and waiting time in queue

Management might be upset that the service desk is working at only 75-percent capacity, or even less, while customers are meanwhile complaining about having to wait, but variation is clearly the root cause. Management can’t do anything about the customer arrival rate, but standardization of the service process, i.e., standard work, might be able to reduce variation in the service time. The process would of course have to account for variation brought in by the customers themselves in the form of different requirements. Another remedy is to increase the number of people at the service desk in response to demand, which is how stores often handle backups at checkout counters.

The same model would incidentally apply to Goldratt’s matchsticks and dice exercise for a single workstation if we use a simulated Poisson distribution rather than a six-sided die. That is, production releases have a Poisson mean of λ while the number processed has a Poisson mean of μ. If we try to run even one workstation, let alone a series in a balanced factory, at close to 100-percent capacity, the results will not be pretty.

We don’t have to live with processing variation

The scenario extends to manufacturing, and Goldratt’s drum-buffer-rope (DBR) production control system, because the only way to ensure that the constraint or capacity-constraining resource (CCR) never runs out of work is to keep a buffer of inventory moving toward it. Parts, unlike people, do not mind waiting, although inventory—to which cycle time and therefore lead time are proportional—is one of the Toyota Production System’s seven wastes.

An unintended takeaway from the matchsticks and dice exercise is, however, the belief that variation in processing or material transfer times is random or common cause variation about which nothing can be done. The matchsticks and dice exercise teaches, in fact, that it is impossible to run a balanced factory at full capacity. However, in My Life and Work (Digireads.com Publishing, 2009, from 1922 original), Henry Ford claims to have done exactly that: “The idea is that a man must not be hurried in his work—he must have every second necessary but not a single unnecessary second.” Ford’s use of a moving assembly line with sequential operations suggests further that he was actually running a balanced factory—one in which each operation has the same capacity—at close to 100-percent capacity.

The only way he could have done this was to reduce the variation to essentially zero. This was achieved partially by eliminating tasks that added cycle time (and variation) but no value. Ford added, “The man who places a part does not fasten it—the part may not be fully in place until after several operations later. The man who puts in a bolt does not put on the nut; the man who puts on the nut does not tighten it.” If each worker did the entire task of putting on and tightening the nut, he would have to pick up and put down the wrench each time, a non-value-adding and non-job-enriching task that constitutes pure muda.

In Captains Courageous, Rudyard Kipling depicted how fishermen subdivided the task of disassembling (cleaning) a fish so nobody would have to pick up and put down a knife for each one. They also used a work slide (or the equivalent) to achieve a single-unit flow process in which “the cod moved along as though they were alive.” Automation, of course, reduces the variation even further because machines can be made to work at a steady rate.

The ideal is to achieve continual single-unit flow, the kind that approximates the flow of liquids and gases in a chemical plant. There is nothing good about batch-and-queue operations that result in parts having to wait—and “wait” is definitely a four-letter word to lean practitioners—to form processing and transfer batches. The bottom line, however, is that much of the variation that impedes productivity comes from special or assignable causes for which DBR is containment rather than correction. Removing variation from other activities such as services can meanwhile increase capacity utilization and customer service simultaneously.


About The Author


William A. Levinson

William A. Levinson, P.E., FASQ, CQE, CMQOE, is the principal of Levinson Productivity Systems P.C. and the author of the book The Expanded and Annotated My Life and Work: Henry Ford’s Universal Code for World-Class Success (Productivity Press, 2013).