Every model of an imperfectly known environment must contemplate two kinds of risk: that incorporated within the model and the risk of failure of the model itself.
Amaranth, the hedge fund group destroyed by energy market losses, boasted of the sophistication of its risk models. One attempt to reproduce its risk metrics suggests that it lost more than half its capital in a nine sigma event – a probability so low that such events simply do not happen. It would be like being struck by lightning and attacked by a mad axeman at the very moment you suffer a fatal heart attack.
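The "nine sigma" arithmetic can be checked directly. Here is a back-of-envelope sketch, assuming – as such risk models typically do – that daily losses follow a normal distribution; the figures illustrate the general point rather than Amaranth's actual model:

```python
import math

# Probability of a daily loss at least nine standard deviations below the
# mean, under a normal distribution: P(Z <= -9) = erfc(9 / sqrt(2)) / 2.
p = math.erfc(9 / math.sqrt(2)) / 2

# Expected wait for one such event, at roughly 252 trading days per year.
years_to_wait = 1 / (p * 252)

print(f"P(nine-sigma daily loss) = {p:.1e}")
print(f"Expected wait = {years_to_wait:.1e} years")
```

The probability comes out around 10^-19, implying a wait millions of times the age of the universe – which is why a model that reports such an event has more plausibly failed than been unlucky.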
But similarly inconceivable contingencies do crop up. Long-Term Capital Management was also the victim of a perfect storm. The one-day market fall in October 1987 was an event that would still have been unlikely even if the New York Stock Exchange had been open since the moment the earth was formed.
To understand the problem, it is better to look at much simpler models than the arcane ones needed to understand modern derivatives markets. Imagine yourself arriving at a bus stop, knowing the frequency of the buses – every 10 minutes – but not the exact arrival time. If buses keep exactly to schedule, the probability that one will arrive in the first minute is one in 10. If a bus does not arrive quickly, the likelihood of one appearing soon increases. After nine minutes, you can be certain that a bus will come in the next minute.
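The exact-schedule case can be written down directly: if you arrive at a random moment, your remaining wait is uniform over the ten-minute interval, so after waiting m minutes the chance a bus appears in the next minute is 1/(10 − m). A minimal sketch (the function name is mine):

```python
def arrival_probability(waited_minutes):
    """P(bus arrives in the next minute), given minutes already waited,
    when buses run on an exact 10-minute schedule and you arrived at the
    stop at a uniformly random moment."""
    remaining = 10 - waited_minutes
    return 1 / remaining

print(arrival_probability(0))  # 0.1 — one in ten in the first minute
print(arrival_probability(9))  # 1.0 — certain after nine minutes
```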
No one who has ever waited for a bus trusts this model. A better approach would make the frequency stochastic – buses arrive every 10 minutes on average, but with wide variability. This is the kind of model used in financial markets. It still predicts that the longer you have waited, the more likely it is a bus will arrive imminently.
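One way to make the frequency stochastic is to draw the gaps between buses from a gamma distribution with a ten-minute mean; the distribution and its parameters here are illustrative choices of mine, not those of any actual market model. Simulating it shows the conditional probability of an imminent arrival rising the longer you have waited:

```python
import bisect
import random

random.seed(42)

# Buses arrive every 10 minutes on average, but with wide variability:
# gamma-distributed headways, mean 10, standard deviation about 3.3.
SHAPE, MEAN = 9.0, 10.0
arrivals, t = [], 0.0
for _ in range(200_000):
    t += random.gammavariate(SHAPE, MEAN / SHAPE)
    arrivals.append(t)

# Turn up at the stop at random moments; record the wait for the next bus.
waits = []
for _ in range(100_000):
    u = random.uniform(0, arrivals[-1])
    i = bisect.bisect_left(arrivals, u)
    waits.append(arrivals[i] - u)

def hazard(m):
    """Empirical P(bus within the next minute | already waited m minutes)."""
    still_waiting = [w for w in waits if w > m]
    return sum(1 for w in still_waiting if w <= m + 1) / len(still_waiting)

print(hazard(0), hazard(5), hazard(9))
```

As the text says, within this model the hazard rises with waiting time: having waited nine minutes makes the next minute a far better bet than the first minute was.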
No one who has waited for a bus or a friend or a pat on the back trusts that conclusion either. At first you have confidence in the model. The bus will adhere to its uncertain schedule. Both you and your friend have intended arrival times based on the agreed time of meeting. Your talents will eventually be recognised. But after an interval, you mistrust the original model. Perhaps there was an accident en route or a misunderstanding over the meeting place. Or perhaps your company does not value you as it should.
That is why your confidence in the rapid arrival of a bus rises at first but then falls. After a sufficiently long interval without a bus, no one remains at the bus stop. Ordinary people are too savvy for that. But there are always a few investors who go on asserting the rightness of their judgment in spite of the evidence before their eyes.
Any mathematical model of an uncertain environment must contemplate two types of risk. The first is incorporated within the structure of the model itself. If buses leave the garage at 10-minute intervals, what is the frequency distribution of their arrival time at a particular stop on the route? Such risks can be described using standard statistical distributions and historical data that reflect traffic conditions
and the variable performance of drivers. Such techniques form the basis of the “value at risk” modelling employed in the financial community.
The second type of risk is uncertainty about whether the model you have developed describes the world accurately – either in the past, from which the data you employ are drawn, or in the future, in which the models you derive from them will be used. Such uncertainties necessarily exist and are unquantifiable.
A model of an imperfectly known environment must contemplate these two kinds of risk – one incorporated within the model and the other presented by failure of the model itself. People at the bus stop implicitly use this reasoning and more sophisticated commentators would do well to follow suit.
Attaching probabilities to forecasts is certainly better than pretending to clairvoyance. But when someone attaches a probability to a forecast, they have – implicitly or explicitly – used a model of the problem. That model accounts for in-model risk but ignores off-model risk. Their forecasts are therefore too confident, and neither you nor they have much idea how over-confident they are. That is why mathematical modelling of risk can be an aid to sound judgment, but never a complete substitute.