DP14025 When the U.S. catches a cold, Canada sneezes: a lower-bound tale told by deep learning

Author(s): Vadym Lepetyuk, Lilia Maliar, Serguei Maliar
Publication Date: September 2019
Date Revised: September 2019
Keyword(s): central banking, clustering analysis, large-scale model, deep learning, machine learning, neural networks, New Keynesian model, supervised learning, ToTEM, ZLB
JEL(s): C61, C63, C68, E31, E52
Programme Areas: Monetary Economics and Fluctuations
Link to this Page: cepr.org/active/publications/discussion_papers/dp.php?dpno=14025

The Canadian economy was not initially hit by the 2007-2009 Great Recession but nonetheless ended up in a prolonged episode of the effective lower bound (ELB) on nominal interest rates. To investigate the Canadian ELB experience, we build a "baby" ToTEM model -- a scaled-down version of the Terms-of-Trade Economic Model (ToTEM) of the Bank of Canada. Our model includes 49 nonlinear equations and 21 state variables. To solve such a high-dimensional model, we develop a projection deep learning algorithm -- a combination of unsupervised and supervised (deep) machine learning techniques. Our findings are as follows. First, the Canadian ELB episode resulted from contagion from abroad, transmitted via large foreign demand shocks. Second, prolonged ELB episodes are easy to generate in open-economy models, unlike in closed-economy models. Third, nonlinearities associated with the ELB constraint have virtually no impact on the Canadian economy, but other nonlinearities do, in particular, the degree of uncertainty and the specific closing condition used to induce the model's stationarity.
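
To give a concrete sense of the "unsupervised plus supervised" combination the abstract describes, the sketch below shows one plausible two-stage structure: cluster simulated state points to obtain a grid that covers the model's ergodic set (unsupervised step), then train a neural network that maps states to decision-rule values on that grid (supervised step). This is an illustrative assumption, not the authors' code: all names (sim_states, grid, policy_on_grid, net), the dimensions other than the 21 state variables, and the placeholder policy target are hypothetical.

```python
# Minimal sketch of a clustering-grid + deep-learning projection solver.
# Assumed structure for illustration only; not the paper's implementation.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_states = 21        # state dimension, as in the baby ToTEM model
n_sim = 10_000       # number of simulated state points (hypothetical)
n_grid = 500         # cluster centers used as the solution grid (hypothetical)

# Stage 1 (unsupervised): simulate the state space and cluster the simulated
# points so that the grid covers the high-probability (ergodic) region rather
# than a full 21-dimensional hypercube.
sim_states = rng.standard_normal((n_sim, n_states))
grid = KMeans(n_clusters=n_grid, n_init=10, random_state=0).fit(sim_states).cluster_centers_

# Stage 2 (supervised): given decision-rule values on the grid (here a made-up
# nonlinear target standing in for values implied by the model's equilibrium
# conditions), fit a deep network approximating the decision rule everywhere.
policy_on_grid = np.tanh(grid @ rng.standard_normal(n_states))  # placeholder target
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(grid, policy_on_grid)

# The fitted network can then be evaluated at arbitrary simulated states, e.g.
# to iterate on the equilibrium conditions or to compute ELB statistics.
print(net.predict(sim_states[:5]))
```

The design point this sketch is meant to convey is dimensionality: with 21 state variables a tensor-product grid is infeasible, whereas a clustered grid on the ergodic set plus a neural-network approximator keeps the number of solution points and the cost of evaluating the decision rules manageable.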