A May 17, 2012 article by Marcus Cree in Futures and Options World (and reposted in the blog SunGard Ten) about Value at Risk (VaR) was both humorous and interesting. Cree, apparently an avid Star Wars fan, called the post “A Long Time Ago, in a Galaxy VaR, VaR Away” and filled it with references to Star Wars trivia (although the author would almost certainly disagree that it was trivia). Cree writes about how VaR is a central component of risk management in the central clearing of derivatives…despite it not being designed for how it is used.
Cree says that VaR was used as a consistent measure of risk across a variety of products. He wrote, “…By determining the sensitivities of all of the traded instruments at a firm, to a range of the most commonly observed drivers (interest rates, volatility, FX rates), the problem could be reduced to then using those sensitivities in a range or distribution of scenarios, itself determined by a correlation process across the observable market data, and finding the worst results at required percentiles (i.e. the 99th worst result)…” Essentially, VaR would shock a position across a variety of scenarios and see what would happen. A subset of the results was then used, typically pegged at the 99th percentile. The scenarios were constructed using correlations – so that, for example, a short position in a US Treasury might offset a long position in a corporate bond, as long as there was demonstrated correlation between the two.
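The sensitivity-and-correlation approach Cree describes can be sketched in a few lines. This is a minimal illustration, not his firm's actual methodology: the sensitivities, driver volatilities, and correlation are made-up numbers, and the 99th-percentile multiplier assumes normally distributed driver moves.

```python
import numpy as np

# Hypothetical dollar sensitivities of a portfolio to two risk drivers
# (say, a rates factor and an FX factor), per unit move in each driver.
sensitivities = np.array([50_000.0, -30_000.0])

# Assumed daily volatilities of the two drivers and their correlation.
vols = np.array([0.01, 0.008])
corr = np.array([[1.0, 0.4],
                 [0.4, 1.0]])
cov = np.outer(vols, vols) * corr  # driver covariance matrix

# Portfolio P&L volatility from the quadratic form s' C s.
portfolio_sigma = np.sqrt(sensitivities @ cov @ sensitivities)

# 99th-percentile loss under a normal assumption (z ≈ 2.326).
z_99 = 2.326
var_99 = z_99 * portfolio_sigma
print(round(var_99))  # one-day 99% VaR in dollars
```

Note how the negative sensitivity partially offsets the positive one through the correlation term – the offsetting effect Cree's Treasury/corporate-bond example describes.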
VaR evolved over time to include historic VaR and Monte Carlo VaR. “…Historic VaR uses actual histories to directly create the scenario distribution, while Monte Carlo uses statistical analysis and correlation data to generate a range of possible future scenarios. Portfolios are then valued under each of the scenarios and the worst cases at specific percentiles are again found and reported…”
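The two variants Cree quotes can be sketched side by side. The P&L history below is simulated and purely illustrative; a real desk would use actual portfolio histories and a fitted scenario model rather than the simple normal draw assumed here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical one-day P&L history for a portfolio, in dollars.
historical_pnl = rng.normal(loc=0.0, scale=10_000, size=1_000)

def historic_var(pnl, level=0.99):
    """Historic VaR: read the loss at the chosen percentile directly
    off the actual P&L history (reported as a positive number)."""
    return -np.percentile(pnl, 100 * (1 - level))

def monte_carlo_var(mu, sigma, level=0.99, n_scenarios=100_000, rng=None):
    """Monte Carlo VaR: generate possible future scenarios from fitted
    statistics, value the portfolio under each, and take the same
    percentile of the simulated distribution."""
    rng = rng or np.random.default_rng()
    simulated = rng.normal(loc=mu, scale=sigma, size=n_scenarios)
    return -np.percentile(simulated, 100 * (1 - level))

hv = historic_var(historical_pnl)
mv = monte_carlo_var(historical_pnl.mean(), historical_pnl.std(), rng=rng)
```

Both numbers answer the same question – the worst case at the 99th percentile – but the historic version is only as rich as the history it is fed, while the Monte Carlo version is only as good as the statistical model behind the scenarios.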
Cree wrote that VaR is the “minimum loss that should occur, at the frequency of the percentile selected.” This differs from what VaR, as the holy grail of risk measurement, actually became – a metric to determine a maximum loss. The effect was to push tail risks – those disasters that are infrequent but potentially far more dangerous – out of the consciousness of risk managers. When those disasters struck, VaR was blamed for being inadequate, when in reality the numbers that VaR calculators spit out, which were treated as gospel, were simply being used incorrectly.
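The minimum-versus-maximum distinction is easy to demonstrate numerically. The sketch below uses an assumed fat-tailed (Student-t) P&L distribution to show that on the days beyond the 99% VaR, the average loss is well above the VaR number itself – VaR is the floor of those worst days, not the ceiling.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical fat-tailed daily P&L: mostly small moves, with the
# occasional large loss (Student-t with 3 degrees of freedom).
pnl = 10_000 * rng.standard_t(df=3, size=100_000)

# 99% VaR: the loss exceeded on roughly 1% of days.
var_99 = -np.percentile(pnl, 1)

# The tail beyond VaR: what actually happens on those worst days.
tail = pnl[pnl < -var_99]
avg_tail_loss = -tail.mean()  # the "expected shortfall" of the tail
```

Here `avg_tail_loss` comes out substantially larger than `var_99`: treating the VaR figure as a maximum loss ignores exactly the tail that Cree warns about.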
Cree comes back to central clearing by noting that the margining process relies on VaR risk calculators to establish daily margin amounts. We think he may have meant initial margin amounts instead. Daily margining simply takes the cash flows, discounts them back using an appropriate curve, and looks at the change – we don’t see where VaR necessarily comes into that. But for figuring out initial margin, where a CCP has to predict how much margin would be necessary to protect it in the event of a counterparty default, the CCP needs some sort of crystal ball telling it how far the market might move, and that is where VaR comes in. Nevertheless, the point is that a metric that really wasn’t designed to measure tail risk is being used to do just that.
A link to the Futures and Options World article is here.
A link to the SunGard Ten post is here.