IFC: what central banks should do in setting up next-generation tech

Public authorities, and central banks in particular, are increasingly realizing the potential of big data sets and analytics – with the development of artificial intelligence (AI) and machine learning (ML) techniques – to provide new, complementary statistical information. Yet the question remains: how should institutions organize themselves to benefit the most from these opportunities?

Two areas appear particularly important for central banks. The first is how to organize their statistical information in relation to their IT infrastructure. The second is to think strategically about how to use appropriate techniques to further process and analyze the newly collected information. Like many national statistical offices (NSOs) and international organizations, central banks have already launched numerous initiatives to explore these issues and share their experience, in particular through the cooperative activities organized by the Irving Fisher Committee on Central Bank Statistics (IFC) of the Bank for International Settlements (BIS).

The BIS itself has developed its new medium-term strategy, Innovation BIS 2025, which relies on substantial investment in next-generation technology to build a resilient and future-ready digital workplace for the organization. A key feature of these various initiatives is that many central banks are currently setting up, or planning to implement, big data platforms to facilitate the storage and processing of very large data sets. They are also developing high-performance computing (HPC) infrastructure that enables faster processing, in-depth statistical analysis and complex data simulations.
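To give a sense of the workloads such platforms are intended to support, the sketch below shows a minimal distributed aggregation job written in Python for a Spark-based environment. It is purely illustrative: the platform choice, data set, paths and column names are assumptions for the example, not a reference to any particular central bank implementation.

```python
# Purely illustrative: a distributed aggregation job of the kind a big data
# platform is meant to support. It assumes a Spark cluster; the data set,
# paths and column names (security-by-security holdings) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("granular-data-aggregation").getOrCreate()

# Granular records stored on the platform's distributed file system.
holdings = spark.read.parquet("/data/securities/holdings.parquet")

# Aggregate security-level records into the kind of summary statistic a
# central bank might analyze: total holdings by sector and reference period.
summary = (
    holdings
    .groupBy("reference_period", "holder_sector")
    .agg(F.sum("market_value").alias("total_market_value"))
)

summary.write.mode("overwrite").parquet("/data/securities/holdings_by_sector.parquet")
```

The same pattern scales from a test sample to the full data set without changes to the analytical code, which is one reason such distributed engines feature prominently in these platform discussions.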

However, these initiatives face important organizational challenges, as central banks must trade off factors such as technology trends, system complexity, cost, performance, reliability, operating model and security. In view of this experience, it is essential to carefully assess the available options before selecting a technology and architecture for big data / HPC platforms.

Among the various issues to be considered, attention should focus primarily on hardware selection, the choice between proprietary and open source technology, the decision to develop the solution in-house or in the cloud, and the type of information to be handled.
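On the in-house versus cloud question in particular, one common way to keep the decision reversible is to place storage access behind a thin interface, so that analytical code does not depend on where the data physically reside. The Python sketch below illustrates this idea; the class names, paths, bucket and example column are hypothetical.

```python
# Illustrative sketch only: a thin storage abstraction that keeps the
# in-house vs. cloud choice reversible. Class names, paths and the bucket
# are hypothetical; the cloud variant assumes an fsspec backend (e.g. s3fs).
from abc import ABC, abstractmethod

import pandas as pd


class DataStore(ABC):
    """Minimal interface that analytical code depends on."""

    @abstractmethod
    def read(self, name: str) -> pd.DataFrame:
        ...


class OnPremStore(DataStore):
    """Reads data sets from the institution's own file system."""

    def __init__(self, root: str):
        self.root = root

    def read(self, name: str) -> pd.DataFrame:
        return pd.read_parquet(f"{self.root}/{name}.parquet")


class CloudStore(DataStore):
    """Reads the same data sets from object storage in the cloud."""

    def __init__(self, bucket: str):
        self.bucket = bucket

    def read(self, name: str) -> pd.DataFrame:
        # Requires an fsspec-compatible backend such as s3fs to be installed.
        return pd.read_parquet(f"s3://{self.bucket}/{name}.parquet")


def average_exposure(store: DataStore, dataset: str) -> float:
    """Example analytical routine: depends only on the DataStore interface."""
    frame = store.read(dataset)
    return float(frame["exposure"].mean())
```

Switching from an on-premises deployment to a cloud provider (or back) then amounts to swapping the store passed to the analytical routines, rather than rewriting them.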

Once the main options are selected, the actual implementation of the related technologies is often a long and multifaceted journey. From a project development perspective, success will depend on the approach taken to conduct the project, the types of workload to be supported, the data architecture envisaged and the use of software development best practices. A broader, institution-level perspective is also key, so that the full range of business requirements, as well as resource and security constraints, is adequately taken into account.
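As one concrete illustration of such software development best practices, statistical transformations can be wrapped in small functions covered by automated tests. The Python sketch below is illustrative only; the deflation helper and its base-100 convention are assumptions made for the example.

```python
# Illustrative sketch of one practice mentioned above: wrapping a statistical
# transformation in a small, automatically tested function. The deflate()
# helper and its base-100 convention are hypothetical examples.
import pandas as pd


def deflate(nominal: pd.Series, price_index: pd.Series) -> pd.Series:
    """Convert a nominal series into real terms using a price index (base = 100)."""
    return nominal * 100.0 / price_index


def test_deflate_is_identity_at_base_period():
    # When the price index sits at its base value, real equals nominal.
    nominal = pd.Series([100.0, 110.0])
    index = pd.Series([100.0, 100.0])
    pd.testing.assert_series_equal(deflate(nominal, index), nominal)


def test_deflate_halves_values_when_prices_double():
    # A doubling of prices halves the real value of a given nominal amount.
    assert deflate(pd.Series([200.0]), pd.Series([200.0])).iloc[0] == 100.0
```

Running such tests automatically, for instance in a continuous integration pipeline, helps keep statistical code reliable as platforms and data sources evolve.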

It is therefore advisable to develop a comprehensive information strategy for the institution, with a high-level roadmap for adopting continuously evolving technologies to manage data and respond to users’ needs. Last but not least, knowledge-sharing can be instrumental, and can be facilitated by the cooperative activities promoted by the BIS and the IFC.

Read the full report
