Anil Gaba


Question the Consensus of Experts

Just because all experts agree doesn’t make them right

Published: Tuesday, January 24, 2017 - 12:03

Whether predicting demand for a product or forecasting spot prices for a resource or currency, we invariably seek out subjective opinions—expert viewpoints—to assist in the information-gathering process in order to make informed decisions. If forecasters are too closely linked, less information can be gleaned from their opinions, and decision-makers are more likely to make costly mistakes.

At times a decision-maker may have access to plenty of relevant historical data on which to build robust statistical models. In many instances, however, even with such data, an overlay of human judgment is inevitable to cope with ever-changing conditions. In predicting demand for a new fashion product, for example, it is important to account not only for rapidly changing tastes and competing products, but also for the possibility of inducing demand that might not otherwise exist.

Thus, even with the advent of new approaches such as artificial neural networks, fuzzy logic, and machine learning, human judgment remains a key element in making predictions.

The impact of experts’ interrelationships

It makes sense to assume that the more judgments and viewpoints we seek, the more information we will amass and the more informed our final decision will be. Although it is tempting to give greater weight to the opinions of experts who we feel are more trustworthy or more experienced, research suggests that simply averaging the collective opinion of a group of individuals yields a much more robust result. In other words, unless you have a very good reason to believe otherwise, you should give equal weight to all experts, regardless of their levels of experience or knowledge.
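The robustness of simple averaging can be illustrated with a small simulation. In this sketch (the demand level and expert noise levels are assumed values for illustration, not taken from the article), an equal-weight average of five experts' forecasts is compared with relying on a single expert chosen at random:

```python
import random

random.seed(0)
true_demand = 100.0
# Assumed forecast-error standard deviations, i.e., differing "expertise".
noise_levels = [5.0, 10.0, 15.0, 20.0, 25.0]

avg_err, single_err = 0.0, 0.0
n_trials = 10_000
for _ in range(n_trials):
    forecasts = [random.gauss(true_demand, s) for s in noise_levels]
    combined = sum(forecasts) / len(forecasts)  # equal-weight average
    picked = random.choice(forecasts)           # rely on one expert at random
    avg_err += abs(combined - true_demand)
    single_err += abs(picked - true_demand)

print(round(avg_err / n_trials, 1), round(single_err / n_trials, 1))
```

Because the average cancels out independent errors, the combined forecast is consistently closer to the true value than a typical individual expert, without any need to estimate who is most skilled.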

What cannot be ignored, however, is the level of dependence between their opinions or forecasts. If two or more experts share similar information, use similar tools and techniques, talk to each other, and generally move in the same circles, they will tend to have very similar opinions, and the amount of information garnered from their insights may be considerably less than expected. There is also a greater chance their forecast range will underestimate uncertainty.

The risk of too little information

The range, or spread, between individual subjective forecasts (which often come in the form of point forecasts) gives decision makers an indication of how uncertain the final price or demand for a product might be. When predictions are more spread out, it could be surmised that the uncertainty about the underlying quantity of interest is higher. On the other hand, as the forecasts become more closely clustered, businesses may speculate that there is consensus and, hence, less uncertainty.

However, this fails to take into consideration the level of correlation between the experts. If there is high dependence among forecasters, businesses could find they are working with far less information than they believe. The opinion of 10 similar experts, for example, may give the same information as that of two or three independent sources, and the spread of their individual predictions may considerably understate uncertainty, increasing the likelihood of costly mistakes.
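A standard statistics result (not the authors' model) makes this concrete: if n forecasts share a common error variance and a pairwise correlation rho, the variance of their average equals that of n / (1 + (n - 1) * rho) independent forecasts. A hypothetical helper shows how quickly the "effective" panel shrinks:

```python
# Effective number of independent experts implied by pairwise correlation.
# Formula from the standard variance of a mean of equicorrelated variables:
#   Var(mean) = sigma^2 * (1 + (n - 1) * rho) / n

def effective_experts(n: int, rho: float) -> float:
    """Independent experts delivering the same information as n correlated ones."""
    return n / (1 + (n - 1) * rho)

for rho in (0.0, 0.3, 0.6, 0.9):
    print(f"rho={rho}: 10 experts ~ {effective_experts(10, rho):.1f} independent")
```

At a correlation of 0.6, ten experts carry roughly the information of one or two independent sources, and as n grows the effective number can never exceed 1/rho, which is why adding more correlated experts yields sharply diminishing returns.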

Avoiding costly mistakes

To address this, we developed a parsimonious, practical approach that builds on previous work on combining opinions. It not only accounts for the correlation between experts but also reduces the number of model parameters to be estimated to just one. In “Assessing Uncertainty From Point Forecasts,” Dana Popescu, INSEAD professor of operations management; Zhi Chen, a doctoral student in decision sciences at INSEAD; and I test our model against existing methods.

The study used the example of a news vendor facing the classic problem of deciding how much stock to order ahead of the selling season without knowing demand. If the vendor under-orders, it forgoes potential profits; if it over-orders, it is left with unsold inventory and loses money. Taking into account the unit cost, selling price, and salvage value, we compared the accuracy of the different approaches by testing the impact of each on the order quantity and expected profit. We found that although many existing methods tended to be overconfident in their assessments, our approach (which accounts for the correlation between point forecasts) erred on the side of caution. It resulted in orders biased in a less costly direction and led to an increase in expected profits that, in some cases, exceeded 20 percent.
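The newsvendor logic can be sketched with the standard critical-fractile rule (the prices and the normal-demand assumption below are illustrative, not the study's data): the optimal order quantity is the demand quantile at underage_cost / (underage_cost + overage_cost), so an understated demand spread directly distorts the order.

```python
# Standard newsvendor sketch; all numbers are assumed for illustration.
from statistics import NormalDist

price, cost, salvage = 10.0, 4.0, 1.0
underage = price - cost        # profit forgone per unit of unmet demand
overage = cost - salvage       # loss per unit left unsold at season's end
critical_fractile = underage / (underage + overage)  # 6 / 9 = 2/3

mean = 100.0                   # consensus point forecast of demand
for sigma in (10.0, 30.0):     # understated vs. wider demand uncertainty
    q = NormalDist(mean, sigma).inv_cdf(critical_fractile)
    print(f"sigma={sigma}: order {q:.1f} units")
```

Because the critical fractile here exceeds 0.5, a wider (more honest) uncertainty estimate pushes the order above the mean forecast; tightly clustered correlated forecasts would shrink the estimated sigma and pull the order back toward an overconfident quantity.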

More informed decisions

Clearly, simply ignoring the dependence among experts is not a good option. Despite all efforts to create a group of independent experts, some form of dependence between their forecasts is inevitable. Previous research has noted an average correlation of 0.6 between managers’ business sales forecasts, while other research has observed that a more articulate and assertive participant in a forecasting deliberation can sway colleagues to such an extent that the final decision reflects that person’s preferences rather than the collective wisdom.

Loss of information due to dependence between experts cannot be overcome simply by increasing the number of experts, even to an extreme. To better assess uncertainty, it is important that business managers assess the correlation among experts when weighing up information, opening the way for better and more informed decisions.

This article is republished courtesy of INSEAD Knowledge. © INSEAD 2016.


About The Author

Anil Gaba

Anil Gaba is a Professor of Decision Sciences, the Orpar Chaired Professor of Risk Management, and Academic Director of the Centre for Decision Making and Risk Analysis at INSEAD.

His work has appeared in several academic journals, including Management Science, Operations Research, Marketing Science, and the Journal of Risk and Uncertainty. He is a co-author (with S. Makridakis and R. Hogarth) of the book Dance with Chance: Making Luck Work For You.

Anil teaches courses on Uncertainty, Data, and Judgment (MBA), Probability and Statistics (PhD), and Bayesian Analysis (PhD). In addition, he teaches modules on Judgments & Decision Making and Risk Management in several executive development programmes around the world. He has won the Outstanding Teacher Award for the INSEAD MBA core course (Uncertainty, Data, and Judgment) ten times.