Anil Gaba, chaired professor of risk management at INSEAD, gave the CFA Institute Middle East Investment Conference a range of practical pointers to help navigate the difficult process of making decisions under conditions of uncertainty.
Gaba began by questioning the popular usage of the expression “black swan,” the term coined by Nassim Nicholas Taleb for a rare event that is not only completely unexpected but also difficult to imagine in advance; in other words, an event we could be forgiven for failing to forecast. Gaba is not convinced that such events are totally unpredictable. On the contrary, he finds predictability everywhere, especially in finance and financial models, which he seems almost weary of poking fun at. Complex models regularly have less power to predict under uncertainty than even simple judgments.
Gaba argues that much uncertainty can be usefully reduced to “subway uncertainty” or “coconut uncertainty.” Subway uncertainty captures the part of uncertainty that is fairly predictable — for example, the quantity of water Dubai or London will use between certain hours of the day, or the chances of catching a subway train on time. The statistical properties of subway uncertainty are well known, and outcomes fall within specified limits.
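A minimal sketch of what “well-known statistical properties” means in practice, using invented numbers (journey times drawn from a normal distribution with a 30-minute mean and 2-minute standard deviation — these figures are illustrative assumptions, not from the talk):

```python
import random

# Hedged illustration of "subway uncertainty": individual outcomes vary,
# but their statistical properties are stable, so limits can be stated in
# advance. The journey-time distribution here is an invented assumption.
random.seed(1)
times = [random.gauss(30, 2) for _ in range(10_000)]

# Under a normal model, roughly 95% of outcomes fall within mean +/- 2 sd.
within = sum(26 <= t <= 34 for t in times) / len(times)
print(f"Share of journeys within 26-34 minutes: {within:.1%}")
```

Because the distribution is stable, a planner can quote the 26–34 minute band in advance and be right about 95% of the time — the essence of uncertainty that is predictable in aggregate even when any single journey is not.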
Coconut uncertainty is quite different and refers to events that we can imagine occurring but cannot predict with any regularity, such as being hit by a falling coconut or eaten by a crocodile. For example, a prominent VaR model, using data between 1916 and 2003 and excluding the most turbulent periods, finds that the Dow Jones should have exhibited a daily movement of 7% only once in 300,000 years. Yet this has, in fact, happened 48 times over the period. These cannot all be black swan events, which by definition are very rare. These are “coconuts constantly falling on our heads, and we are ignoring them,” Gaba says.
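The scale of that mismatch is easy to check. The sketch below is not the VaR model from the talk; it uses a plain normal model with an assumed daily volatility of about 1% (so a 7% move is a 7-sigma event) and gives an even more extreme once-in-billions-of-years figure than the once-in-300,000-years the richer model implied — the mismatch with the observed 48 occurrences only grows:

```python
import math

# Hedged illustration (not the model cited in the talk): how often "should"
# a 7% daily move occur under a normal model, versus how often it did?
# Assumed inputs: ~1% daily volatility, ~250 trading days per year, and
# the 1916-2003 window (~87 years) mentioned in the talk.
daily_vol = 0.01                 # assumed daily standard deviation
threshold = 0.07                 # a 7% daily move = 7 sigma here
trading_days_per_year = 250
years = 2003 - 1916              # ~87 years of data

# Two-sided tail probability P(|Z| >= 7) for a standard normal
z = threshold / daily_vol
p_tail = math.erfc(z / math.sqrt(2))

expected_moves = p_tail * trading_days_per_year * years
years_per_move = 1 / (p_tail * trading_days_per_year)

print(f"Normal-model probability of a 7% day: {p_tail:.2e}")
print(f"Expected 7% days in {years} years: {expected_moves:.2e}")
print(f"One such day every ~{years_per_move:.2e} years")
print("Observed 7% days in the same period: 48")
```

The gap between the model's expectation (effectively zero such days) and the 48 observed is the signature of fat tails: the coconuts are not rare at all, the model simply cannot see them.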
“Financial markets, the price of oil, earthquakes, life expectancy, success, happiness — all these are a mixture of the two (subway and coconuts),” says Gaba. He stated that when we use models, we capture only subway uncertainty, and the same is true when we use judgments. He believes this is because subway uncertainty is easier to comprehend, and we would rather forget about the coconuts, which cause us a lot of trouble.
Gaba took pains to distinguish between black swans and coconut uncertainty. “We have the known knowns, the known unknowns — some we can model, some we can’t — and then you have the black swans, which are the unknown unknowns,” Gaba says. “Coconut uncertainty is not just because of the very rare black swans but also because of the much more frequent known unknowns.”
He argued that one solution is to accept that rational models, such as those used in the financial markets, do not predict rational behaviour but may well predict a systematic deviation from it. Investors should combine such models with human psychology and sociology in order to make more sense of uncertainty. But Gaba does not find much help in behavioural finance. He believes its main finding is that, as human beings dealing with uncertainty, we fall into the same predictable traps again and again. Investors may find it tough to step off this moving stairway of behavioural biases.
Another step in Gaba’s solution is to understand the danger of the “illusion of control,” particularly the illusion supplied by quantitative work. “Complex models can fit the past data well, but that doesn’t necessarily mean that they fit the future well,” he says. It is no great feat to fit a function to a data series, but a close fit does not mean the function will predict better than simpler models, he added. A key point raised about forecasting was that when we give a range, we tend to make our high estimate too low and our low estimate too high, so the ranges we state are too narrow for the uncertainty we actually face.
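The "fits the past, not the future" point can be made concrete with a hedged sketch on synthetic data (all numbers invented): fit a simple line and a degree-9 polynomial to the first half of a noisy, truly linear series, then score both on the unseen second half.

```python
import numpy as np

# Hedged sketch: a complex model fits the past better but the future worse.
# The data are synthetic: a straight line plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 4, 40)
y = 1.5 * x + rng.normal(scale=1.0, size=x.size)

x_fit, y_fit = x[:20], y[:20]     # "the past" we fit on
x_new, y_new = x[20:], y[20:]     # "the future" we predict

simple = np.polyfit(x_fit, y_fit, 1)     # simple model: a line
complex_ = np.polyfit(x_fit, y_fit, 9)   # complex model: degree-9 poly

def mse(coeffs, xs, ys):
    """Mean squared error of a fitted polynomial on (xs, ys)."""
    return float(np.mean((np.polyval(coeffs, xs) - ys) ** 2))

print("in-sample MSE    (simple, complex):",
      mse(simple, x_fit, y_fit), mse(complex_, x_fit, y_fit))
print("out-of-sample MSE (simple, complex):",
      mse(simple, x_new, y_new), mse(complex_, x_new, y_new))
```

The degree-9 polynomial achieves the lower in-sample error, then extrapolates wildly on the unseen half, while the simple line stays close — a small-scale version of the trap Gaba describes.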
Under Gaba’s theory of the illusion of control, things we assume depend on skill and graft in fact depend mostly on chance. “Even when the dangers of illusion of control are high and the stakes are very high, we still keep on falling into this trap,” Gaba says. “If we fall into this trap, we underplay the role of chance, we underestimate uncertainty.” The corresponding benefit is that if we recognise the limits of predictability and stop trying to control what we cannot, then, paradoxically, outcomes can improve.