RE•WORK: AI applications summit hosts banks, regulators and exchanges

RE•WORK’s recent AI applications summit in September featured financial industry representation from Nasdaq, National Bank of Canada, Goldman Sachs, the New York Fed and the Bank of England, speaking on a variety of topics.

Read the full post-event report

From the Bank of England, Eryk Walczak, senior research data scientist, presented on “Measuring Complexity of Banking Regulations Using Natural Language Processing & Network Analysis”.

Abstract: The banking reforms that followed the financial crisis of 2007–08 led to an increase in UK banking regulation from almost 400,000 to over 720,000 words, and to concerns about its complexity. The Bank of England’s research team define complexity in terms of the difficulty of processing linguistic units, both in isolation and within a broader context, and use natural language processing and network analysis to calculate complexity measures on a novel dataset covering the near universe of prudential regulation for banks in the United Kingdom before (2007) and after (2017) the reforms.

Linguistic complexity, i.e. textual and network complexity, in banking regulation is concentrated in a relatively small number of provisions, and the post-crisis reforms have accentuated this feature. In particular, comprehending provisions within a tightly connected ‘core’ requires following long chains of cross-references.
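To make the cross-reference analysis concrete, here is a minimal, illustrative sketch using standard graph tooling; it is not the Bank of England’s actual pipeline. It assumes cross-references have already been extracted as (citing provision, cited provision) pairs, and the provision IDs and edges below are made up for illustration. A tightly connected ‘core’ is taken to be the largest strongly connected component, and chain length is the depth of references a reader must follow from a given provision.

# Sketch: cross-reference network analysis for a rulebook.
# Edges mean "A cites B", so reading A may require reading B.
# Provision IDs and edges are toy data for illustration only.
import networkx as nx

edges = [
    ("Art.92", "Art.25"), ("Art.25", "Art.26"), ("Art.26", "Art.36"),
    ("Art.36", "Art.26"),                    # a cycle -> part of a connected 'core'
    ("Art.92", "Art.429"), ("Art.429", "Art.430"),
]
G = nx.DiGraph(edges)

# A tightly connected 'core': the largest strongly connected component,
# i.e. provisions that refer to each other directly or indirectly.
core = max(nx.strongly_connected_components(G), key=len)
print("core provisions:", core)

# Chain depth from a provision: how far the reader must follow
# cross-references before the chain bottoms out (BFS depth).
def reference_depth(graph, node):
    depths = nx.single_source_shortest_path_length(graph, node)
    return max(depths.values())

for provision in G.nodes:
    print(provision,
          "reachable refs:", len(nx.descendants(G, provision)),
          "max chain depth:", reference_depth(G, provision))

In this toy example, Art.92 has the deepest chain because its references fan out into the cyclic pair Art.26/Art.36, which is exactly the kind of ‘core’ concentration the research describes.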

Key takeaways

  • AI/ML techniques can be used to study the complexity of banking regulations
  • They describe the changes to UK banking regulations before and after the Great Financial Crisis (2007 vs. 2017)
  • They develop a new dataset that can be used for other purposes; this research can be seen as an early step towards automating banking regulations (regtech)

Key quotes

“We found four facts on the textual complexity of the post-crisis reforms: one, a tighter core emerging in the network, centred around the CRR; two, the legal style limits the complexity of language in individual rules; three, at least ⅓ of the rules contained vague terms that required substantial interpretation; and four, we validated our measures using the EBA Q&A and a case study on the definition of capital.”

“Our measures of complexity are also derived from linguistics, with lexical diversity, conditionality and length as measures. These measures capture local complexity, i.e. the cognitive costs incurred while reading a rule.”
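As a rough sketch of how such per-rule (‘local’) measures could be computed: length as a word count, lexical diversity as a type-token ratio, and conditionality as a count of conditional cues. The tokeniser, the list of conditional markers and the example rule text below are illustrative assumptions; the exact operationalisations in the underlying research may differ.

# Sketch of local (per-rule) complexity measures of the kind quoted above.
# Word lists, tokeniser and example text are illustrative assumptions.
import re

CONDITIONAL_MARKERS = {"if", "unless", "where", "provided", "subject", "except"}

def local_complexity(rule_text: str) -> dict:
    tokens = re.findall(r"[a-z]+", rule_text.lower())
    length = len(tokens)                                              # length: word count
    diversity = len(set(tokens)) / length if length else 0.0          # lexical diversity: type-token ratio
    conditionality = sum(t in CONDITIONAL_MARKERS for t in tokens)    # conditionality: count of conditional cues
    return {"length": length,
            "lexical_diversity": round(diversity, 3),
            "conditionality": conditionality}

rule = ("Institutions shall deduct the applicable amount where the conditions "
        "in paragraph 1 are met, unless the competent authority has granted "
        "permission subject to the requirements of this Regulation.")
print(local_complexity(rule))

Measures like these are cheap to compute per provision, which is what makes it feasible to score the near universe of UK prudential rules before and after the reforms.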

Watch the full video

