You may disagree with the number of years or the percent, but everyone understands that automation in the funding and collateral space is occurring at a fast pace. The question is how to prepare for this inevitable future. Our view is that connecting data from disparate sources is the key to the next evolution in the funding markets. A guest post from Transcend.
Who in the capital markets industry isn’t seeking greater profitability or returns? Between balance sheet pressures, competitive dynamics and the resources required to comply with regulation, focusing on transformative change to advance the firm has been a huge challenge. At the same time, technology is evolving at a rapid pace and the availability of structured and unstructured data is presenting a whole new level of opportunity. For firms to realize this opportunity, connecting disparate data and adopting smart algorithms across the institution are a critical part of any strategy.
Advances in technology have allowed data to be captured and presented to traders, credit, regulators, and operations. But right now, most data is fragmented, looking more like spaghetti than a coherent picture of activity across the organization. Individual extracts exist that sometimes cross silos, but more often cannot be reconciled across sources or users. To be effective, data needs to flow from the original sources and be readable by each system in a fully automated way. It does not matter if individual systems are old or new, in the cloud or behind firewalls, from vendor packages or in-house technology: they all have to work together. We call this connected data.
Businesses have understood for some time that this will require growing automation, and that automation will be a critical driver of success. Banks and asset managers know that they have to do something: doing nothing is not an option. Machine learning and artificial intelligence are part of the solution, and firms have embarked on projects large and small to enable automation under watchful human eyes. The new element to consider in the pace of change is the ability of machines to connect, process and analyze data within technology platforms for exposure management, regulatory reporting and pricing. The more data that feeds into technology on the funding desk, the more automated decision-making can occur.
While individual systems and silos can succeed on their own, a robust and integrated data management process brings the pieces together and enables the kinds of decision-making that today can only be performed by senior finance and risk managers. Connected data is therefore possibly the most important link between automation and profitability. It is a daunting task to consider major changes to every system in play, but most firms are adopting a strategy of building a centralized platform that brings together data from multiple businesses and sources. A key benefit of this strategy is that advances in technology and algorithms can be applied to this platform, enabling multiple businesses, or potentially the whole enterprise, to benefit from the investment.
The risk of inaction
Connected data can stake its claim as the next great competitive advantage in the markets. Like algorithmic trading and straight-through processing, which were once novelties and are now taken for granted, the build-out of a connected data architecture combined with the tools to analyze that data will initially provide some firms with an important strategic advantage in cost and profitability management.
With all the talk about data, there is an important human element to what inaction means. In a data-driven, technology-led world, having more people, or even all the right people, will not stop a firm from being left behind, and in fact may become a strategic disadvantage. The value of automation is to identify a trade opportunity based on its characteristics, the firm’s capital and the current balance sheet profile. Humans cannot see this flow with the same speed as a computer, and cannot decide as quickly whether the trade is profitable from a funding and liquidity perspective. While the classic picture of a trader shouting across a room to check whether a trade is profitable makes for a good movie scene, it is unwieldy in the current environment. A competitor with connected data in place can make that decision in a fraction of the time and execute the trade before the slower firm has brought it to enough decision-makers to move forward.
The competitive race towards connected data means that firms with more headcount will see higher costs and less productivity. As firms with efficient and automated funding decision tools employ new processes for decision-making, they will gain a competitive advantage due to cost management, and could even drive spread compression in the funding space. This will put additional pressure on firms that have stood still, and is the true danger of inaction at this time.
Action items for connected data
Data is only as good as the reason for using it. Firms must embark on connecting their data with an understanding of what the data is for, which we call foundational functionality. This is the initial building block for what can later become a well-developed real-time data infrastructure.
Each transaction has three elements: a depository ladder for tracking movements by settlement location; a legal entity or trading desk ladder; and a cash ladder. Each of these contains critical information for connecting data across the organization. If your firm already has a cross-business view of fixed income, equities and derivatives on a global basis, then you are due a vacation. We have not yet seen this work completed at any firm, however, and expect it to be a major focus for banks through 2019 and 2020.
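As a rough illustration of the three ladders, the sketch below models a single transaction as entries on a depository, legal entity and cash ladder. The class and field names, and the repo example at the end, are hypothetical and only meant to show how one trade appears on all three ladders at once; they are not a prescribed schema.

```python
# Minimal, hypothetical data model for the three ladders described above.
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass(frozen=True)
class LadderEntry:
    ladder: str                        # "depository" | "legal_entity" | "cash"
    key: str                           # settlement location, entity/desk, or currency
    value_date: date
    quantity: float                    # securities quantity or signed cash amount
    security_id: Optional[str] = None  # None for pure cash movements

@dataclass
class Transaction:
    trade_id: str
    entries: List[LadderEntry]

    def ladder(self, name: str) -> List[LadderEntry]:
        """Return the entries belonging to one of the three ladders."""
        return [e for e in self.entries if e.ladder == name]

# A repo, for example, shows up on all three ladders at once
# (identifiers and amounts are made up for illustration).
repo = Transaction(
    trade_id="T-0001",
    entries=[
        LadderEntry("depository", "DTC", date(2019, 6, 3), -1_000_000, "UST-10Y-EXAMPLE"),
        LadderEntry("legal_entity", "US Broker-Dealer / Repo Desk", date(2019, 6, 3), -1_000_000, "UST-10Y-EXAMPLE"),
        LadderEntry("cash", "USD", date(2019, 6, 3), 998_500.0),
    ],
)
print(repo.ladder("cash"))
```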
Ultimately, an advanced data infrastructure must provide and connect many types of data in real-time, such as referential data, market data, transactions and positions. “Unstructured” data, such as agreements and terms, capital and liquidity constraints, and risk limits, must also be available more broadly for better decision-making, despite their tendency to be created in some specific silo. But an important early step is ensuring visibility into global, real-time inventory across desks, businesses, settlement systems and transaction types; this is critical to optimize collateral management. Access to accurate data can increase internalization and reduce fails, cutting costs and operational RWA. This is especially important for businesses that have decoupled their inventory management functionality over time, for example, OTC derivatives, prime brokerage and securities financing. Likewise, the ability to access remote pools of high-quality assets, whether for balance sheet or lending purposes, can have direct P&L impacts.
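To make the inventory point concrete, here is a minimal sketch, assuming each silo (prime brokerage, OTC derivatives, securities financing and so on) can export its positions as simple (source, security, quantity) records. The aggregation and the internalization check are illustrative only, not a production inventory engine.

```python
# Hypothetical aggregation of per-silo positions into a firm-wide inventory view.
from collections import defaultdict
from typing import Dict, Iterable, List, Tuple

def global_inventory(feeds: Iterable[Tuple[str, str, float]]) -> Dict[str, Dict[str, float]]:
    """Aggregate (source, security_id, quantity) records by security, then by source."""
    inventory: Dict[str, Dict[str, float]] = defaultdict(lambda: defaultdict(float))
    for source, security_id, quantity in feeds:
        inventory[security_id][source] += quantity
    return inventory

def internalization_candidates(inventory: Dict[str, Dict[str, float]]) -> List[str]:
    """Securities where one silo is long while another is short: covering the
    short internally avoids an external borrow (and a potential fail)."""
    return [
        security_id
        for security_id, by_source in inventory.items()
        if any(q > 0 for q in by_source.values()) and any(q < 0 for q in by_source.values())
    ]

feeds = [
    ("prime_brokerage", "UST-10Y-EXAMPLE", 5_000_000),
    ("securities_financing", "UST-10Y-EXAMPLE", -2_000_000),
    ("otc_derivatives", "BUND-10Y-EXAMPLE", 1_000_000),
]
print(internalization_candidates(global_inventory(feeds)))  # -> ['UST-10Y-EXAMPLE']
```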
Step two is the development of rules-based models to establish the information flows that are critical to connecting data across a firm while simultaneously optimizing businesses at the book, business entity and firm levels. The system must understand a firm’s flows and which variables it needs to monitor and control within a business line and across the firm. Data will push in both directions, for example to and from regulatory compliance databases or between settlement systems and a trader’s position monitors. Rules-based systems simplify and focus what is otherwise a very complex set of inter-related and overlapping priorities (see Exhibit 1).
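A toy version of such a rules-based flow is sketched below: each rule checks a proposed trade against a book-, entity- or firm-level constraint, and the combined results decide whether the trade is auto-approved, escalated or rejected. The rule names, thresholds and trade attributes are invented for illustration.

```python
# Hypothetical rules-based check of a proposed trade at book/entity/firm level.
from typing import Callable, List, NamedTuple

class Trade(NamedTuple):
    notional: float
    balance_sheet_usage: float   # USD of balance sheet consumed
    lcr_impact_bps: float        # estimated change in liquidity coverage ratio
    desk_limit_remaining: float  # unused portion of the desk's limit

Rule = Callable[[Trade], str]    # each rule returns "pass", "fail" or "review"

def within_desk_limit(t: Trade) -> str:          # book-level rule
    return "pass" if t.notional <= t.desk_limit_remaining else "fail"

def balance_sheet_budget(t: Trade) -> str:       # entity-level rule (made-up threshold)
    return "pass" if t.balance_sheet_usage <= 50_000_000 else "review"

def liquidity_neutral(t: Trade) -> str:          # firm-level rule (made-up threshold)
    return "pass" if t.lcr_impact_bps >= -5 else "review"

def evaluate(trade: Trade, rules: List[Rule]) -> str:
    """Any 'fail' rejects the trade; any 'review' escalates it to a human."""
    results = [rule(trade) for rule in rules]
    if "fail" in results:
        return "reject"
    return "escalate" if "review" in results else "auto-approve"

rules = [within_desk_limit, balance_sheet_budget, liquidity_neutral]
print(evaluate(Trade(10_000_000, 10_000_000, -2.0, 25_000_000), rules))  # -> auto-approve
```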
Connected data can enable significant improvements such as:
- Regulatory models can be fed a real-time, pre-trade “what-if” scenario so businesses know how much a particular trade absorbs in terms of capital, liquidity or balance sheet for the given return, or whether a trade is balance sheet-, capital- or margin-reducing (see the sketch after this list).
- Data can feed analytics that tell a trader, salesperson, manager or any stakeholder what kind of trades they should focus on in order to keep within their risk limits, with information on a granular client level.
- XVA desks, the groups often charged with balancing out a firm’s risk and capital, can not only be looped in but can also push information back to traders in real time so they know the impact of a trade.
- Systems that track master agreements can be linked and analytics can point toward the most efficient agreement to use for a given trade.
- Trading and settlement systems can interface with market utilities in both directions.
- Transfer pricing tools can be built into the system core and be transparent to all stakeholders with near instantaneous speed, at scale.
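As a concrete illustration of the first bullet, the sketch below scores a proposed trade’s expected return against the capital, balance sheet and liquidity it would consume. The hurdle rates and field names are hypothetical, and real regulatory models would of course be far richer.

```python
# Hypothetical pre-trade "what-if" check against return-on-resource hurdles.
from dataclasses import dataclass

@dataclass
class WhatIf:
    expected_pnl: float         # annualized expected P&L, USD
    rwa_usage: float            # risk-weighted assets consumed, USD
    balance_sheet_usage: float  # gross balance sheet consumed, USD
    hqla_outflow: float         # high-quality liquid assets tied up, USD

def clears_hurdles(w: WhatIf,
                   min_return_on_rwa: float = 0.02,
                   min_return_on_bs: float = 0.005,
                   min_return_on_hqla: float = 0.01) -> bool:
    """True if the trade's return covers every resource it consumes; a
    resource-reducing trade (usage <= 0) passes that check automatically."""
    checks = [
        (w.rwa_usage, min_return_on_rwa),
        (w.balance_sheet_usage, min_return_on_bs),
        (w.hqla_outflow, min_return_on_hqla),
    ]
    return all(usage <= 0 or w.expected_pnl / usage >= hurdle
               for usage, hurdle in checks)

trade = WhatIf(expected_pnl=600_000, rwa_usage=20_000_000,
               balance_sheet_usage=80_000_000, hqla_outflow=5_000_000)
print(clears_hurdles(trade))  # -> True under these made-up hurdle rates
```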
Transcend’s recent experience with some of the top global banks shows the value of consolidating data into one infrastructure. We are connecting the front and back office to market infrastructure and providing information in a real-time dashboard. As trades book on the depository ladder, key stakeholders can see the change in their dashboard application and can make funding decisions manually or feed new parameters back to pricing models across the enterprise. The same transactions and positions affect the real-time inventory view from legal entity or customer perspectives, as well as driving cash and liquidity management decisions. Over time, as banks get more comfortable with their data management tools, the parts of decision-making that follow specific rules can be automated. This will be an excellent deployment of the new data framework.
Betting on the time or the percent
As machine learning and AI advance, and connected data becomes more of a reality, technology platforms will learn how to efficiently mine and analyze data to understand if a trade satisfies institutional regulatory, credit, balance sheet, liquidity, and profitability hurdles. This will lead to an environment where a trade inquiry comes in electronically, is accepted or rejected, and processed automatically through the institution’s systems. The steps in this process are methodical, and there is nothing outside of what financial institutions do today that would prevent execution. A reduction in manual intervention can allow traders to focus on what is important: working on the most complex transactions to turn data into information and action.
The fact that more automation is occurring in funding markets is certain. The question at this time is how long it will take to automate most of the business. This is a bet on the timeline, or on the percent of funding decisions that can be automated, not on the direction of the trend line. Could it be as much as 90% in five years? The answer will vary by firm, and some of the major players are already developing strategies to progress in this direction. Typically, people overestimate the impact of a new technology in the short term but underestimate its impact in the long term. Banks have already invested in machine learning and AI tools to make automated funding a reality. But success will depend on the next and more complex step: ensuring that connected data can reach these tools, allowing for a robust view of positions, regulatory metrics and profitability requirements across the firm.
Bimal Kadikar
Co-Founder and CEO, Transcend
Bimal Kadikar is a co-founder and the CEO of Transcend, a technology firm dedicated to helping financial institutions optimize collateral and liquidity enterprise-wide to increase efficiency, reduce risk and drive greater business performance. Bimal leads Transcend’s business and product strategies, working closely with major global banks to implement modular, innovative technology solutions that connect the front-to-back office for sharper funding and capital decisions.
Bimal previously served in several senior roles in Citi’s Capital Markets Technology Division. He also led the global technology organization for Citi’s Fixed Income, Currencies and Commodities (FICC) businesses, and built the Prime Finance, Futures & OTC Clearing technology platform, supporting Citi’s growth. Bimal spearheaded Citi’s firm-wide post-2008 crisis strategy for Collateral, Liquidity and Margin, which became the inspiration for Transcend’s next-generation collateral and liquidity management solutions.