Finadium’s upcoming Rates & Repo Europe conference is gearing up for a thorough look at the major dynamics reshaping repo markets in 2024 and beyond. Ahead of the March 14 event, we speak with our “Big Data, Big Technology, Big Opportunities” panelists from Euroclear and Tradeweb about how data is already being used from pre- through post-trade, as well as the role of market infrastructures in data quality and delivery.
The “power of data” makes industries and markets a lot more resilient, transparent, efficient and, ultimately, a lot more mature, said Nicola Danese, co-head of International Developed Markets at trading platform Tradeweb. And while repo may be a laggard, his team sees the market catching up to the electronification trends already entrenched in other products and asset classes, particularly in the growing importance of being strategic with data.
Moreover, data derived from electronic execution is “a great way to preserve data quality”, and Danese stressed that aggregation processes relying on normalization techniques can struggle to achieve the same quality. At the same time, even compared to other largely bilateral markets, repo stands out for its complexity: pricing is a function of tenor, counterparty credit risk and haircut levels, among other factors, all of which make observable repo data difficult to work with.
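To see why those variables compound, consider the basic cash flows of a single repo trade. The sketch below uses standard textbook repo mechanics with purely illustrative figures; it is not drawn from Tradeweb’s data or methodology:

```python
# Illustrative repo economics: how haircut, rate and tenor interact.
# All figures are hypothetical; ACT/360 day count assumed (common for EUR repo).

collateral_value = 10_000_000  # market value of the bonds posted (EUR)
haircut = 0.02                 # 2% haircut demanded by the cash lender
repo_rate = 0.039              # 3.90% annualized repo rate
days = 30                      # tenor of the trade

# Cash lent is the collateral value reduced by the haircut.
purchase_price = collateral_value * (1 - haircut)

# Repurchase price adds simple interest for the tenor.
repurchase_price = purchase_price * (1 + repo_rate * days / 360)

print(f"Cash lent:        {purchase_price:,.2f}")
print(f"Repurchase price: {repurchase_price:,.2f}")
print(f"Repo interest:    {repurchase_price - purchase_price:,.2f}")
```

Because each of these inputs varies trade by trade and counterparty by counterparty, two prints on the same bond and date can legitimately differ, which is part of what makes naive aggregation of observed repo rates so tricky.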
When it comes to products built on repo data on the trading side, clients are asking for a variety of things, such as collateral valuation, a tape, visibility into bilateral haircuts and building out a repo curve. The latter is where Tradeweb’s team is focused.
Initial moves in this space have been made in tandem with Tradeweb’s data science team to develop a repo curve for five government issuers in Europe: the UK, France, Germany, Italy and Spain. The result estimates a general collateral (GC) curve for each issuer, identifies specials, and builds a specials curve on top. This is then used on the buy-side for sourcing collateral or analyzing relative value trades, or on the sell-side as an independent control function in pricing trading books, for example at month-end, Danese explained.
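Tradeweb has not published its methodology, but a common approach to this class of problem illustrates the idea: take a robust central tendency of traded rates per tenor as the GC estimate, then flag bonds financing meaningfully below it as special. All names, data and thresholds below are hypothetical:

```python
import pandas as pd

# Hypothetical trade records for a single issuer; real data would span
# many tenors, producing a full GC curve rather than a single point.
trades = pd.DataFrame({
    "isin":  ["DE0001", "DE0001", "DE0002", "DE0003", "DE0003"],
    "tenor": ["1W", "1W", "1W", "1W", "1W"],
    "rate":  [3.90, 3.88, 3.89, 3.55, 3.52],  # repo rates in percent
})

# GC proxy per tenor: the median rate across all trades in that tenor.
gc_curve = trades.groupby("tenor")["rate"].median().rename("gc_rate")

# Specialness = GC rate minus the bond's own average repo rate;
# bonds financing well below GC are trading "special".
by_isin = trades.groupby(["tenor", "isin"])["rate"].mean().reset_index()
by_isin = by_isin.join(gc_curve, on="tenor")
by_isin["specialness"] = by_isin["gc_rate"] - by_isin["rate"]

THRESHOLD = 0.10  # flag anything 10bp or more below GC (arbitrary cutoff)
specials = by_isin[by_isin["specialness"] >= THRESHOLD]
print(specials)
```

Repeating the exercise across tenors yields the GC curve, and the flagged deviations form the raw material for a specials curve.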
“There is the usual asymmetry here, the buy-side is more interested because their starting point is they have less transparency compared to the sell-side,” he noted.
Reconciling data differences
Data can mean different things across market participants, said Olivier Grimonpont, managing director and head of Product Management for Market Liquidity at financial market infrastructure Euroclear. He breaks it down into three categories: traditional data; asset class differentiation, such as for ESG or new markets, which tends to be static; and value-added data, such as valuation, position, activity and balances, which tends to be dynamic.
Euroclear provides settlement and custody of domestic and cross-border securities across bonds, equities, derivatives and investment funds, and as such sits on considerable quantities of data that can be used for “value-added services”, he noted: “We provide a huge set of data that is coming from multiple sources, and those data can then be used either by ourselves, and we are providing value-added services, like simulation, optimization, but we also allow our customer to take the data that we have and run their own value-added services or even outsourcing it to a third party.”
How that data is then used and deployed is the client’s choice: for pre-trade analytics to determine the eligibility of products or counterparties for trading, for liquidity metrics, or for directing post-trade collateral allocation, for example to reduce risk-weighted assets (RWA) or to improve liquidity coverage ratio (LCR) and net stable funding ratio (NSFR) positioning.
Data for optimization
One of the points of friction for securities financing is reconciling the need for granularity with the need for harmonization as part of optimizing collateral management, Grimonpont explained: “How do you compare one eligibility set versus another one? There’s a need for huge harmonization that will allow basket trading and there is also a need for really granular activity that could go down to a single ISIN for bilateral financing.”
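The tension is easy to see in data terms: a harmonized basket is naturally expressed as a coarse rule set, while a bilateral schedule may pin eligibility to individual ISINs. The sketch below uses a hypothetical schema and data to contrast the two views:

```python
# Hypothetical eligibility schedules: a harmonized basket defined by coarse
# rules versus a bilateral schedule pinned to individual ISINs.

basket_rules = {"asset_class": {"govt"}, "min_rating": "AA", "currency": {"EUR"}}
bilateral_isins = {"DE0001102345", "FR0000571218"}  # explicit ISIN whitelist

RATING_ORDER = ["AAA", "AA", "A", "BBB"]  # coarse rating scale, best first

def eligible_in_basket(security: dict) -> bool:
    """Rule-based check: the harmonized, basket-trading view."""
    return (
        security["asset_class"] in basket_rules["asset_class"]
        and security["currency"] in basket_rules["currency"]
        and RATING_ORDER.index(security["rating"])
            <= RATING_ORDER.index(basket_rules["min_rating"])
    )

def eligible_bilaterally(security: dict) -> bool:
    """ISIN-level check: the granular, bilateral-financing view."""
    return security["isin"] in bilateral_isins

security = {"isin": "DE0001102345", "asset_class": "govt",
            "currency": "EUR", "rating": "AAA"}

# The same bond can be eligible under one schedule and not the other;
# reconciling these two views is the harmonization problem described above.
print(eligible_in_basket(security), eligible_bilaterally(security))
```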
Optimization is high on the list of client demands, with an eye on reducing balance sheet impact or RWA, for example by weighing alternative routes such as guaranteed repo. “All those innovations (are) going to a very clear direction, which is to minimize the cost of running a repo book on the balance sheet,” said Grimonpont.
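In spirit, this reduces to a cost-minimizing allocation problem. Production optimizers use LP/MIP solvers over regulatory cost models; the greedy sketch below, with hypothetical positions and cost weights, only illustrates the shape of the problem:

```python
# Toy collateral optimization: assign available positions to obligations so
# that a total balance-sheet cost proxy is minimized. Figures are invented.

positions = [  # (isin, market_value, cost_per_unit as an RWA proxy)
    ("DE0001", 5_000_000, 0.02),
    ("IT0001", 5_000_000, 0.05),
    ("FR0001", 5_000_000, 0.03),
]
obligations = [("repo_A", 4_000_000), ("repo_B", 5_500_000)]  # cash to cover

# Greedy heuristic: fill each obligation with the cheapest collateral first.
available = sorted(positions, key=lambda p: p[2])  # lowest cost first
allocation, total_cost = [], 0.0
for name, need in obligations:
    remaining = need
    for i, (isin, value, cost) in enumerate(available):
        if remaining <= 0 or value <= 0:
            continue
        used = min(value, remaining)
        allocation.append((name, isin, used))
        total_cost += used * cost
        available[i] = (isin, value - used, cost)  # deplete the position
        remaining -= used

print(allocation)
print(f"cost proxy: {total_cost:,.0f}")
```

A real engine would also encode haircuts, eligibility schedules and LCR/NSFR effects as constraints, which is exactly why harmonized, granular data is a precondition for this kind of optimization.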
While this is traditionally driven by the sell-side, he added that the buy-side will feel those challenges too, particularly non-rated entities, on the back of the Basel III Endgame: “The cost of doing SFTs with those (firms) will massively increase the risk weighted asset of their bank counterparty going forward. It’s not a single-sided problem anymore…if the repo world needs to continue to work and grow, and support the industry as it does today, we need to make sure that there are solutions out there that will enable the banks and the dealers to continue to provide that service to the buy-side.”
Nicola and Olivier will be joining colleagues from DekaBank and Finadium on the panel “Big Data, Big Technology, Big Opportunities” at Rates & Repo Europe, where they will discuss these and other major implications of data in repo markets. Rates & Repo is a conference for cash investors, dealers, market intermediaries, technology firms and other service providers. Register here for the in-person panel discussions on March 14.