Gilles Hilary

Embrace the Fuzzy Crystal Ball

Complex models that accurately forecast the future are a poor way to plan for uncertainty

Published: Monday, April 4, 2016

Phil used to be a very senior financial executive. When asked for a number, he would typically give a rough ballpark answer, such as, “It’s about 5 percent.” He’d then be peppered with questions about how he had arrived at that figure.

After a while, he got tired of this questioning and started bringing a stack of financials with him to every meeting. From then on, instead of providing an approximate but essentially accurate answer, he would turn to his printout, thumb through the pages, randomly point to a specific line, and answer, “It is 4.96 percent.” The questions stopped. The oracle had spoken.

Phil’s experience is not unique. Humans tend to dislike uncertainty. For example, many people are happy to play roulette, despite its inherent risk and expectation of loss, but only a few are willing to participate in a wager if the odds are not clearly defined, even if they can choose their side of the gamble.

Models, particularly those with a veneer of complexity and sophistication, cater to this aversion. Various academic studies suggest that seemingly more precise numbers act as more potent anchors, and that when people are given more information they become more confident in their judgment even though their actual performance does not improve. Data do not guarantee knowledge.

Looking at Phil’s 4.96-percent figure, we are all the more likely to anchor strongly on that view and to feel even greater confidence in the precision of the measure. In fact, many people don’t want to be bothered with details, particularly when data go against their prior beliefs. For example, studies are less likely to change people’s minds when they provide more information about how they were conducted. Knowledge can be a curse.

Anchored in risk models

Naturally, this has implications for risk management. Risk modeling has made incredible strides, particularly in financial markets, and experts have far more sophisticated indicators of their risk positions than ever before. Along the way, however, businesspeople ceased to be able to follow conversations built on esoteric mathematical concepts. If you can’t convince, confuse.

But even simpler things are being missed in the conversation. For example, one of the most common measures of risk in the financial sector is value at risk (VAR). VAR estimates the loss a firm should not expect to exceed, at a given confidence level, over a certain period given the positions it holds. The higher the VAR, the greater the risk. Recently, VAR measures have dropped. Unfortunately, the drop occurred because the indicator is typically based on the previous five years of data: as observations from the financial crisis rolled out of the calculation window, the value of the indicator fell. Needless to say, the underlying risk has not been affected. The indicator itself is not the risk.
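To see concretely how a backward-looking indicator can fall while the underlying risk is unchanged, here is a minimal sketch of a historical VAR estimate computed on a rolling five-year window. The data are synthetic and the `historical_var` helper is hypothetical, not any particular firm’s model; the point is only that the same calculation produces a much smaller number once the crisis observations drop out of the window.

```python
import numpy as np

def historical_var(returns, confidence=0.99, window=None):
    """Historical value at risk: the daily loss that should be exceeded
    only (1 - confidence) of the time, estimated from past returns.
    If `window` is given, only the most recent `window` observations are
    used -- which is how crisis data can silently roll out of the estimate."""
    data = np.asarray(returns)[-window:] if window else np.asarray(returns)
    return -np.percentile(data, 100 * (1 - confidence))  # reported as a positive loss

# Synthetic daily returns: calm years, one crisis year, then calm years again
rng = np.random.default_rng(0)
calm_before = rng.normal(0.0003, 0.01, 1250)   # five calm years of trading days
crisis      = rng.normal(-0.002, 0.04, 250)    # one crisis year, four times the volatility
calm_after  = rng.normal(0.0003, 0.01, 1300)   # five-plus calm years afterwards
returns = np.concatenate([calm_before, crisis, calm_after])

window = 5 * 250  # a rolling five-year window of trading days
print("VAR just after the crisis:", round(historical_var(returns[:1500], window=window), 4))
print("VAR once the crisis has rolled out:", round(historical_var(returns, window=window), 4))
```

Run on these illustrative figures, the second estimate comes out several times lower than the first even though nothing about the hypothetical firm’s positions has changed.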

Models aren’t reality

At the same time, the most notable problems have arisen from issues of uncertainty, as unpredictable surprises swamp businesses. For example, the banks that failed or suffered during 2008 had wonderfully complex risk models, yet those models failed to consider the possibility of a major increase in correlations among individual instruments and across asset classes, a cascade that eventually expanded to include regulatory and structural changes to markets. The surprise was that the models had not anticipated this. Reality is a stubborn thing.
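A back-of-the-envelope sketch shows why such a correlation spike is so damaging. Portfolio volatility depends not only on each instrument’s own risk but on how the instruments move together, and the diversification benefit the models count on shrinks quickly as correlations approach one. The figures below are hypothetical (ten identical assets, equal weights), chosen purely to illustrate the mechanism.

```python
import numpy as np

def portfolio_volatility(weights, vols, rho):
    """Volatility of a portfolio in which every pair of assets shares
    the same correlation `rho`."""
    n = len(weights)
    corr = np.full((n, n), rho)
    np.fill_diagonal(corr, 1.0)
    cov = np.outer(vols, vols) * corr       # covariance matrix from vols and correlations
    return float(np.sqrt(weights @ cov @ weights))

# Ten assets, each with 20% volatility, equally weighted (illustrative numbers)
n = 10
weights = np.full(n, 1.0 / n)
vols = np.full(n, 0.20)

for rho in (0.1, 0.5, 0.9):
    print(f"correlation {rho:.1f} -> portfolio volatility {portfolio_volatility(weights, vols, rho):.3f}")
```

With these assumptions, portfolio volatility roughly doubles as the common correlation rises from 0.1 to 0.9: the same positions, measured as “safe” in normal times, become far riskier the moment everything starts moving together.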

Nearly twenty years ago, a major hedge fund, Long-Term Capital Management (LTCM), was run by finance veterans: a small army of Ph.D.s and no fewer than two Nobel Prize winners. In 1998, it nearly caused a global financial meltdown when the same kind of spike in asset correlations occurred. We learn from history that we do not learn from history.

Manage uncertainty, not just risk

These uncertainties, by their very definition, aren’t something that can be included in the typical risk model, but that doesn’t mean they should simply be ignored. Blaming failure on a black swan isn’t particularly useful either. Careful consideration of and preparation for the types of disruptions that can occur due to unknown events is still prudent planning and can aid business operations in times of trouble. Plans may be useless, but planning is indispensable.

It’s never too late for risk professionals to take steps to better frame their environment, including the formatting, vocabulary, and visual display of the information they present. Such steps might seem basic compared to the advanced mathematics that goes into today’s risk models, but they are opportunities to bridge the gap between managing risk and managing uncertainty. Sometimes less is more.

Phil may save a lot of time by giving overly precise answers, but he’ll get far more insightful discussion in return if he starts the conversation with a recognition of uncertainty and embraces the fuzziness inherent in any crystal ball.

This article is republished courtesy of INSEAD Knowledge. © INSEAD 2016.

About The Author

Gilles Hilary

Gilles Hilary is an INSEAD professor of accounting and control and the Mubadala chaired professor in corporate governance and strategy. He is also a contributing faculty member to the INSEAD Corporate Governance Initiative. Hilary regularly teaches courses on corporate governance, risk management, financial analysis, decision-making processes, and behavioral finance. He has an MBA from Cornell University, a Ph.D. from the University of Chicago, and a French professional accounting degree.