The Depository Trust & Clearing Corporation (DTCC) has issued a whitepaper, “Data Strategy & Management in Financial Markets”, that identifies data management challenges, highlights themes expected to drive the evolution of financial market data exchange and data management over the next decade, and outlines the foundation needed to support that change.
Kapil Bansal, managing director and head of Business Architecture, Data Strategy & Analytics at DTCC, said in a statement: “For many years, companies have collected massive stores of data, but the siloed nature of data and the need for better data quality limited the ability of market participants to extract strategic insights to support more effective decision-making. We’re now at a unique moment where we can make this vision a reality, but long-term success hinges on market participants taking action to ensure their data strategy can meet the demands of a highly digitalized and interconnected marketplace.”
DTCC’s latest whitepaper details four hypotheses about how data will be used in financial markets in the future:
- Data will be more accessible and secure: data users will have increased flexibility in determining what data they receive, how, and when. To enable this, data governance, privacy and security will need to be prioritized.
- Interconnected data ecosystems as a new infrastructure layer for the financial industry: industry participants will free their own data from legacy systems, pool it into data ecosystems, and connect those ecosystems to others. This will reduce duplication of data across the industry and allow for the co-development of innovative data insights.
- Increased capacity to focus on data insights: more efficient data management, cloud-enabled capabilities, and further automation of routine data management tasks will free up capacity and accelerate time to market for new product development, allowing specialized data analysts and data operations teams to focus on deriving insights from vast stores of data.
- Ubiquity of “open source” data standards: it is anticipated that the industry will continue to adopt more standards around data models, with the most viable use cases being reference and transaction reporting data. This will result in increased operational efficiency and better data quality.
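
The whitepaper does not prescribe a particular standard, but the value of a shared data model can be illustrated with a minimal sketch. The snippet below assumes a hypothetical reference-data record with fields such as isin, currency and maturity_date (these names are illustrative assumptions, not part of the whitepaper or any specific industry standard); because every participant works from the same model, the same conformance check can run identically on the producer and consumer side.

```python
# Illustrative only: a hypothetical shared reference-data model.
# Field names (isin, currency, maturity_date) are assumptions for this sketch.
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class SecurityReference:
    isin: str            # 12-character international securities identifier
    currency: str        # ISO 4217 currency code, e.g. "USD"
    maturity_date: date  # kept mandatory to keep the sketch simple


def validate(record: SecurityReference) -> list[str]:
    """Return a list of quality issues; an empty list means the record conforms."""
    issues = []
    if len(record.isin) != 12 or not record.isin[:2].isalpha():
        issues.append("isin: expected 12 characters starting with a 2-letter country code")
    if len(record.currency) != 3 or not record.currency.isupper():
        issues.append("currency: expected a 3-letter ISO 4217 code")
    if record.maturity_date < date.today():
        issues.append("maturity_date: security appears to have already matured")
    return issues


# Because producers and consumers share the same model, this check yields the
# same result on both sides, reducing reconciliation breaks between firms.
print(validate(SecurityReference(isin="US0378331005", currency="USD",
                                 maturity_date=date(2030, 1, 1))))  # -> []
```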
To enable these changes, the whitepaper suggests that institutions which produce and consume significant amounts of data embed key principles into their data operating models, including:
- Establishing robust foundational data management capabilities, including having a thorough understanding and catalog of data, breaking down data silos, and implementing sound data quality practices (a simple example of such practices is sketched after this list).
- Supporting strong data governance, including the right set of data privacy and security standards to enable data collaboration with partners.
- Exploring where there is mutual benefit from collaborative data environments across firms and the industry to advance interoperability.
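
To make the first principle concrete, here is a minimal sketch of what a machine-readable catalog entry and two routine data quality checks (completeness and uniqueness) might look like. The dataset, owner, and column names are assumptions made for the example, not drawn from the DTCC whitepaper.

```python
# Illustrative only: a minimal data-catalog entry plus two routine quality checks.
from dataclasses import dataclass, field


@dataclass
class CatalogEntry:
    name: str                     # dataset name as registered in the catalog
    owner: str                    # accountable team or data steward
    description: str
    key_columns: list[str] = field(default_factory=list)


def completeness(rows: list[dict], column: str) -> float:
    """Share of rows with a non-empty value in `column`."""
    filled = sum(1 for r in rows if r.get(column) not in (None, ""))
    return filled / len(rows) if rows else 0.0


def uniqueness(rows: list[dict], columns: list[str]) -> bool:
    """True if the combination of `columns` uniquely identifies each row."""
    keys = [tuple(r.get(c) for c in columns) for r in rows]
    return len(keys) == len(set(keys))


entry = CatalogEntry(
    name="trade_confirmations",
    owner="post-trade-operations",
    description="Confirmed trades awaiting settlement",
    key_columns=["trade_id"],
)

rows = [
    {"trade_id": "T-1", "counterparty": "ABC"},
    {"trade_id": "T-2", "counterparty": ""},
]

print(completeness(rows, "counterparty"))   # 0.5 -> flag for remediation
print(uniqueness(rows, entry.key_columns))  # True -> key constraint holds
```

Running checks like these automatically against cataloged datasets is one way the routine part of data quality work can be standardized, which is the kind of foundational capability the whitepaper describes.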
Applying these principles will help market participants gain access to data that is trapped or underutilized today and allow for new and faster insights.