We are revisiting an old question that has come up again as we work on due diligence of tri-party collateral managers: how should illiquid collateral assets be priced? At the same time, the Fed released a new report today asking a version of the same question, although its focus is valuations for regulatory purposes. Whatever the purpose, we think it is time for a global valuations standard.
In an ideal world, all assets would be regularly traded and there would be a recent settlement price that the market has agreed to. After all, every security's price is the result of what “the market” thinks it is worth. Pricing models may be interesting, but in the end, if the market devalues a product by 10%, then the price will fall by 10% from the previous day. The updated price will be disseminated by the major data vendors and brokers, and will also appear in clearing house settlement systems.
In our tri-party survey, we find that some tri-party managers (agents) may on occasion use broker prices to value their assets. This is akin to brokers delivering pricing for OTC derivatives, the client sending those prices to triOptima, and triOptima then comparing the client's prices against the broker's. Clearly there will be no disagreement on the pricing. Looking at this from the perspective of client independence, the fox is most certainly in the henhouse.
But if a product hasn’t traded for a while, how should it be valued? Some methodologies require obtaining quotes from three competing brokers. There is a reasonable assumption that getting three bid or offer quotes for an individual security would yield a market price. This becomes entirely unreasonable, however, when dealing with thousands of security prices a day; even the brokers would have to automate their request lines just to meet the demand. Let’s put this solution aside for now.
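For concreteness before setting it aside, here is a minimal sketch of what a three-quote consensus calculation might look like; the median-of-mids convention and the quote data are our own assumptions, not a market standard:

```python
from statistics import median

def consensus_price(quotes):
    """Consensus mark from competing broker quotes.

    `quotes` is a list of (bid, offer) tuples; we take the median of the
    mid prices as the consensus. Requires at least three quotes.
    """
    if len(quotes) < 3:
        raise ValueError("need quotes from at least three brokers")
    mids = [(bid + offer) / 2.0 for bid, offer in quotes]
    return median(mids)

# Hypothetical quotes for one corporate bond, price per 100 face value
print(consensus_price([(98.10, 98.60), (97.90, 98.50), (98.25, 98.75)]))  # 98.35
```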
The Fed’s new blog post and the linked staff report, “No Good Deals—No Bad Models,” by Nina Boyarchenko, Mario Cerrato, John Crosby, and Stewart Hodges, propose a model based on Sharpe ratios and a presumption that investors expect a worst-case scenario. In her blog post, Ms. Boyarchenko makes an interesting argument:
“We assume that investors are averse to model uncertainty, and thus require that investors’ decisions be made using the worst-case distribution among the set of distributions that they consider. To find the worst-case distribution, the investor evaluates her utility under each distribution in the set of alternatives, and picks the distribution that delivers the lowest expected utility.”
She continues:
“We show, moreover, that increased model uncertainty leads to fundamentally different bounds than an increase in the maximum Sharpe ratio. While the maximum Sharpe ratio restriction corresponds to a bound on the volatility of returns, aversion to model uncertainty introduces a consideration for negative skewness into the computation of lower and upper bounds on asset valuations.”
We aren’t certain that this model can be reliably applied to thinly traded securities, but we think it is worth exploring. In fact, at this point we think it is worth exploring a good number of avenues, including bank models and asset manager models, to see what might work on a global basis.
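To make the worst-case idea in the quoted passages concrete, here is a minimal sketch of an investor evaluating expected utility under several candidate return distributions and using the one that delivers the lowest expected utility. Everything here is an assumption for illustration: the CRRA utility function, the risk-aversion coefficient, and the three hypothetical distributions; the paper's actual good-deal-bound machinery is considerably more involved. Note that the negatively skewed alternative has lower volatility than the "higher volatility" case yet ends up as the worst case, echoing the point in the second quote.

```python
import numpy as np

rng = np.random.default_rng(0)
GAMMA = 5.0  # hypothetical CRRA risk-aversion coefficient

def expected_utility(returns, gamma=GAMMA):
    """CRRA expected utility of terminal wealth 1 + r (a textbook choice, not the paper's)."""
    wealth = 1.0 + returns
    return np.mean(wealth ** (1.0 - gamma)) / (1.0 - gamma)

def certainty_equivalent(eu, gamma=GAMMA):
    """Certain wealth level delivering the same utility as the risky position."""
    return ((1.0 - gamma) * eu) ** (1.0 / (1.0 - gamma))

# Candidate one-period return distributions the investor considers plausible
# for a thinly traded bond. All parameters are illustrative assumptions.
n = 200_000
candidates = {
    "base case": rng.normal(0.02, 0.05, n),
    "higher volatility": rng.normal(0.02, 0.08, n),
    # Negatively skewed: a 4% chance of a large loss (default / fire sale),
    # with overall volatility no higher than the "higher volatility" case.
    "negatively skewed": np.where(rng.random(n) < 0.04,
                                  rng.normal(-0.30, 0.05, n),
                                  rng.normal(0.04, 0.035, n)),
}

# Ambiguity-averse valuation: evaluate utility under each candidate and use
# the worst case, i.e. the distribution with the lowest expected utility.
worst = min(candidates, key=lambda name: expected_utility(candidates[name]))
for name, rets in candidates.items():
    ce = certainty_equivalent(expected_utility(rets))
    flag = "  <-- worst case" if name == worst else ""
    print(f"{name:>18}: vol = {rets.std():.3f}, "
          f"certainty-equivalent wealth per 100 invested = {100 * ce:6.2f}{flag}")
```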
As a simple alternative to a complex model, market practitioners might agree on a methodology under which a price more than X days old is no longer viable; any corporate bond or ABS posted as collateral with that stale a price would either become ineligible as collateral or carry a hefty haircut. A sketch of such a rule follows.
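In this sketch, the five-day age threshold and the 25% haircut are placeholders for the unspecified "X days" and "hefty haircut" above, chosen purely for illustration:

```python
from datetime import date

# Hypothetical parameters -- the threshold and haircut are illustrative only.
MAX_PRICE_AGE_DAYS = 5      # "X days": prices older than this are stale
STALE_PRICE_HAIRCUT = 0.25  # hefty haircut if the asset remains eligible

def collateral_value(market_value, last_price_date, valuation_date,
                     allow_stale_with_haircut=True):
    """Value a corporate bond or ABS posted as collateral under a stale-price rule.

    Returns the lendable collateral value: full market value if the price is
    fresh, a haircut value if stale prices are tolerated, or 0.0 (ineligible).
    """
    age = (valuation_date - last_price_date).days
    if age <= MAX_PRICE_AGE_DAYS:
        return market_value
    if allow_stale_with_haircut:
        return market_value * (1.0 - STALE_PRICE_HAIRCUT)
    return 0.0  # ineligible as collateral

# Example: a bond last priced 12 days before the valuation date
print(collateral_value(1_000_000, date(2013, 9, 10), date(2013, 9, 22)))  # 750000.0
```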
We are well aware that this is an old topic: a plethora of academics and market participants have developed models to tackle this valuation problem. What we and the Fed are seeking now is a practical solution that can be accepted globally. Let us know if you have one – we are interested to hear about it.