Minitab LLC

Quality Insider

Starbucks Wait Times and Process Capability

The probability of waiting more than five minutes is high

Published: Friday, December 14, 2012 - 16:29

If you’re in line for a coffee at the local Starbucks, analysis conducted by graduate students at Rutgers University suggests that the probability of waiting more than five minutes for your tall, hot, three-pump, sugar-free vanilla, one-pump mocha, half-soy, half-nonfat latte with whip is very high.

Brandon Theiss and Matthew Brown used a reliability engineering project to combine two of their passions: Starbucks coffee, and gathering and analyzing data with Minitab Statistical Software.

Theiss drew on his work experience in crafting the study. Currently a principal industrial engineer at Medtronic, he previously was a Six Sigma Master Black Belt at American Standard Brands, and a systems engineer at Johnson Scale Co. In 2010, the American Society for Quality named him one of its Top 40 Leaders in Quality Under 40.

“Virtually anything can be characterized as a process and measured,” Theiss says. “Once you have the data, you can use a tool like Minitab to draw conclusions and hopefully improve the process.”

Although many of their classmates simply analyzed existing data, Theiss had a personal motivation for collecting real data. “I selected Starbucks to study because I am quite an addict,” he admits. “However, going to Starbucks is a very common experience, so it’s something everyone can relate to.”

When customers visit a Starbucks, they expect a consistent experience in terms of both their beverage and the time required to receive it. The team defined “meeting customers’ expectations” as receiving their beverages in less than five minutes. Then, to see if the national Starbucks experience would be delivered at arbitrarily selected Starbucks locations, Theiss and Brown chose two Starbucks stores in New Jersey: one in Marlboro and the other in New Brunswick.

Brown collected data in the Marlboro store for three hours, while Theiss sat in the New Brunswick location for four hours. Each set up a laptop and used a simple stopwatch application to record customer arrival and wait times in Excel. Starbucks’ “public café” culture made it easy to gather data without attracting attention. “I spend so much of my time in the New Brunswick Starbucks that I am seen as furniture, so I went undetected,” Theiss says. “I believe Matt received a few awkward glances.”

After gathering their data, they used Minitab to analyze it. They subjected the frequencies of arrivals to a goodness-of-fit test for the Poisson distribution. In theory, Poisson-distributed arrivals typically experience gamma-distributed wait times. The team then tested how well their wait-time data fit the normal, gamma, and Weibull distributions, both to validate the theoretical assumption and to account for potential confounding by the beverage-making process.
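The team ran these tests in Minitab's GUI; the study's raw measurements are not public, but the same workflow can be sketched in Python with SciPy on synthetic stand-in data. Everything below (the seed, sample sizes, and distribution parameters) is a hypothetical illustration, not the team's actual numbers:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical stand-in data; the study's raw measurements are not public.
arrivals = rng.poisson(3.0, size=60)              # customers per 5-min window
waits = rng.gamma(shape=2.0, scale=1.2, size=94)  # wait times in minutes

# Informal Poisson check: Poisson counts have variance roughly equal to mean.
print(f"arrivals: mean={arrivals.mean():.2f}, var={arrivals.var(ddof=1):.2f}")

# Fit each candidate wait-time distribution and compare the fits.
# (KS statistics are optimistic when parameters are estimated from the
# same data, so treat them as a relative ranking only.)
candidates = {"normal": stats.norm, "gamma": stats.gamma,
              "weibull": stats.weibull_min}
for name, dist in candidates.items():
    params = dist.fit(waits)
    loglik = dist.logpdf(waits, *params).sum()
    ks = stats.kstest(waits, dist.cdf, args=params).statistic
    print(f"{name:8s} log-likelihood={loglik:8.2f}  KS statistic={ks:.3f}")
```

The distribution with the highest log-likelihood (and smallest KS statistic) is the best candidate model for the wait times.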

Once they confirmed the wait time distribution, the team performed a process capability analysis for each location, correcting for biased data due to small sample size. Finally, they used individuals and moving range control charts to evaluate whether the beverage delivery process was in statistical control.
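Minitab automates non-normal capability analysis and I-MR charting; a rough equivalent can be sketched by hand. The sketch below uses the percentile method for a non-normal Ppk (with only an upper spec limit, Ppk reduces to Ppu) on hypothetical gamma-distributed stand-in data, not the study's own measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
waits = rng.gamma(shape=2.0, scale=1.2, size=94)  # hypothetical stand-in data
USL = 5.0                                         # five-minute upper spec limit

# Non-normal capability via the percentile method: with only an upper
# spec, Ppk reduces to Ppu = (USL - median) / (99.865th pctl - median).
a, loc, scale = stats.gamma.fit(waits)
p50 = stats.gamma.ppf(0.5, a, loc=loc, scale=scale)
p99865 = stats.gamma.ppf(0.99865, a, loc=loc, scale=scale)
ppu = (USL - p50) / (p99865 - p50)

# Expected defects per million: P(wait > USL) under the fitted model.
ppm = stats.gamma.sf(USL, a, loc=loc, scale=scale) * 1e6

# Individuals chart limits from the average moving range (2.66 = 3/d2, n=2).
mr_bar = np.abs(np.diff(waits)).mean()
ucl = waits.mean() + 2.66 * mr_bar
lcl = waits.mean() - 2.66 * mr_bar
print(f"Ppu={ppu:.2f}  PPM={ppm:,.0f}  I-chart limits=({lcl:.2f}, {ucl:.2f})")
```

A Ppu below 1 flags a process unable to hold the five-minute spec, and the PPM figure is simply the fitted model's tail probability beyond the spec scaled to one million customers.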

The process capability analysis for the 94 wait-time measurements collected from the Marlboro data had a very low process performance (Ppk) value, which implies a process that is not capable of meeting the five-minute upper specification limit. Another interesting statistic in this analysis is the parts per million (PPM) value. Analysis of the Marlboro data implies that for every 1 million customers entering, 127,306 will not receive their beverage in less than five minutes.

The analysis of the 198 wait time measurements collected from the New Brunswick location yielded a Ppk of 0.13 for the gamma model, again implying a process that is not capable. The PPM value implies that more than one out of every four customers will wait longer than expected to receive their beverage.

Next the team assessed whether a statistically significant difference existed between the wait times at the two locations. Because they had sufficient evidence to believe that the underlying distributions were non-normal, Brown and Theiss used a Kruskal-Wallis comparison test, which does not assume that the data are normally distributed. The low p-value from the Kruskal-Wallis test indicated that there was a significant difference between wait times at the two locations, with New Brunswick taking longer. The data set supports the conclusion that the location a customer visits has a significant effect on the time they will wait for their beverage, with neither location meeting the expected five-minute maximum wait time.
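The Kruskal-Wallis test compares groups by rank rather than by mean, which is why no normality assumption is needed. A minimal sketch with SciPy, on hypothetical stand-in samples sized to match the article's 94 Marlboro and 198 New Brunswick observations (the real data are not public):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical stand-in samples for the two stores; only the sample
# sizes (94 and 198) are taken from the article.
marlboro = rng.gamma(shape=2.0, scale=1.0, size=94)
new_brunswick = rng.gamma(shape=2.0, scale=1.5, size=198)

# Kruskal-Wallis H-test: rank-based, so no normality assumption.
h_stat, p_value = stats.kruskal(marlboro, new_brunswick)
print(f"H = {h_stat:.2f}, p = {p_value:.4g}")
if p_value < 0.05:
    print("Wait times differ significantly between the two locations.")
```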

Of course, Theiss notes, this study had many limitations. “Our data set was small and far from comprehensive,” he says. “Both the New Brunswick and Marlboro data were collected for a rather short duration, on a single day. A more complete analysis would include a long data-collection window, which would allow the model to include factors such as the time of day, day of the week, and even time of the year. In addition the data set did not include the number or type of drinks ordered, but these data would be difficult to collect without the assistance of Starbucks.”

Although the mean time-to-beverage was less than five minutes for all scenarios analyzed, the probability of waiting more than five minutes is still very high. This implies that thirsty, caffeine-craving customers are willing to wait what would appear to be a long time to receive their beverage. These findings certainly haven’t reduced the frequency of his own visits to Starbucks, Theiss says. “Unfortunately, I still go there several times a day, and I am on a first-name basis with the baristas at three different locations.”


About The Author

Minitab LLC

Minitab Inc. develops statistical analysis and process improvement software for academic and commercial users. Minitab Statistical Software, a data analysis tool for businesses, has been used to implement virtually every major Six Sigma quality improvement initiative, and to teach statistics in more than 4,000 colleges and universities worldwide. Quality Companion is used to plan and execute Six Sigma projects. Use the subscription-based online learning course, Quality Trainer, to learn or refresh your knowledge and to access quality statistics expertise anytime. Interactive lessons based on real problems make concepts easy to retain.


Bad example of a capability study

What a pity that there was NO further analysis of whether this is a stable process (control chart), nor any indication that it is a homogeneous process. These are basic mistakes when performing a capability study.

In the complete analysis

In the complete analysis, which I will gladly send you a copy of, there are several control charts showing that the process is stable over the observed time period.

In regard to homogeneity, the premise was that of an arbitrary person walking into the given Starbucks. The wait times they experienced, based upon the data collected, were fairly well modeled by a gamma distribution. Clearly, given the small sample size, there are confounding factors that were not studied, as this was not intended to be a comprehensive study. It was merely meant to serve as an example that the same tools, techniques, and methods used to study complex manufacturing systems can also be applied to something as apparently trivial as waiting in line at Starbucks.

Further Analysis

Would you be able to send me a copy of your full data? I would love to take a deeper look at this in my spare time.


That really seems like a lot of technically correct overanalysis.

How does any of that analysis fix anything? I guess that's why I've never gravitated toward Six Sigma.