Researchers from the Bank of Finland consider predicting systemic financial crises one to five years ahead using recurrent neural networks. Prediction performance is evaluated on the Jorda-Schularick-Taylor dataset, which includes crisis dates and relevant macroeconomic series for 17 countries over the period 1870–2016.
Recurrent neural networks (RNNs) are a family of neural networks designed for sequential data such as language and time series. An RNN consumes its input one step at a time, maintaining a hidden state (akin to memory) that it updates dynamically as it processes the sequence. A key idea is parameter sharing across time steps, which restricts the number of parameters in the model and helps avoid overfitting. In this study, the researchers consider three RNN architectures: the basic RNN, the RNN with long short-term memory (LSTM) cells, and the RNN with gated recurrent unit (GRU) cells.
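The sequential hidden-state update and parameter sharing described above can be illustrated with a minimal sketch. This is a hypothetical single-unit vanilla RNN with scalar weights, not the models or data used in the study; the function name and parameters are illustrative assumptions.

```python
import math

def rnn_forward(xs, w_xh, w_hh, b_h, h0=0.0):
    """Minimal single-unit vanilla RNN (illustrative sketch only).

    The same three parameters (w_xh, w_hh, b_h) are reused at every
    time step -- this reuse is the parameter sharing that keeps the
    model small regardless of sequence length.
    """
    h = h0
    states = []
    for x in xs:  # inputs are consumed one step at a time
        # hidden state blends the new input with the previous state
        h = math.tanh(w_xh * x + w_hh * h + b_h)
        states.append(h)
    return states

# The final hidden state depends on the whole history of inputs,
# so the same values presented in a different order yield a
# different result -- the "memory" at work.
states = rnn_forward([1.0, 0.5, -0.2], w_xh=0.8, w_hh=0.3, b_h=0.0)
```

LSTM and GRU cells replace the single tanh update with gated updates that control how much of the past state is kept, which helps with longer-range dependencies.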
Previous literature has found simple neural network architectures useful for predicting systemic financial crises. The researchers show that such predictions can be greatly improved by using recurrent neural network architectures, which are especially suited to time-series input. The results remain robust under extensive sensitivity analysis.