The ECB hosted its second Annual Research Conference on 25-26 September 2017. The event brought together academia and central banks working at the cutting edge of economics, covering a wide range of research relevant for the ECB. The conference featured both empirical and theoretical contributions on themes such as secular stagnation, the effects of negative interest rates, and exit from non-standard monetary policy.
In his opening remarks, Vice-President Vítor Constâncio highlighted the value that the ECB attaches to research, as “it contributes to shaping the intellectual framework that we use to understand economic developments and to take policy decisions”. He stressed that the field of macroeconomics is still adjusting to the questions raised by the Great Recession. As a monetary policymaker, Constâncio emphasised the usefulness of flexible models that can be adapted swiftly to address new questions in a timely manner.
Jean Monnet lecture: The pension dilemma
The keynote lecture was given by Nobel laureate Peter Diamond of MIT and chaired by the ECB’s Peter Praet. Speaking about “good pension design” – a difficult task that has to take into account the properties of capital markets and the sharing of economic risks – Diamond drew a distinction between pension systems and pension plans, adding that it is a mistake to carry out a full-blown normative analysis of a plan without the context of the rest of the system.
The focus was on defined contribution pension systems. The turning point for defined contribution pensions, Diamond said, was what took place in Chile in 1981 under the Pinochet military dictatorship. The government introduced a defined contribution system together with two provisions: a guaranteed minimum benefit for people who contributed for at least 20 years, and a safety net for the elderly. In 1981 Chile had a good plan, but there was a large segment of the public for whom it was not a good system.
The 1981 Chilean pension plan had been hailed in Latin America as a system that could not easily go bust, because every pension is tied to the contributions an individual worker pays over decades in the labour market. For workers with a steady, full-career job it is a pretty good system, but they are a small minority of the Chilean public. The system is now extraordinarily unpopular in Chile, Diamond explained.
“What can you do to get a system that works better in advanced countries?”, Diamond asked. Sweden went through an overhaul in the mid-1990s. Economists in Sweden and Italy simultaneously invented the concept of a notional defined contribution system – a brilliant idea, Diamond said, “[a] very clever way of dealing with some of the risk issues.” For example, since 1996 Italy has no longer calculated pensions under the old pay-as-you-go rules but under the notional defined contribution system. On the positive side, this will make the pension system more financially solid. On the negative side, it will cut most pension payments compared with the previous pay-as-you-go system.
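The mechanics of a notional defined contribution system can be sketched in a few lines: contributions accumulate in a notional account that earns a notional rate (typically linked to wage growth rather than market returns), and at retirement the balance is converted into an annuity using a divisor tied to cohort life expectancy. The sketch below is a stylised illustration; the contribution rate, notional rate, and annuity divisor are hypothetical, not the actual Swedish or Italian parameters.

```python
# Stylised notional defined contribution (NDC) accrual.
# All parameters are hypothetical illustrations, not actual
# Swedish or Italian system values.

def notional_capital(wages, contribution_rate, notional_rate):
    """Accumulate contributions in a notional account.

    Each year's contribution is credited with the notional rate
    (typically linked to wage growth) until retirement, instead of
    an actual market return: financing stays pay-as-you-go.
    """
    capital = 0.0
    for wage in wages:
        capital = capital * (1 + notional_rate) + wage * contribution_rate
    return capital

def annual_pension(capital, annuity_divisor):
    """Convert the notional balance into an annuity at retirement.

    The divisor reflects the remaining life expectancy of the
    retiring cohort, so longevity risk is shared systematically.
    """
    return capital / annuity_divisor

# A 40-year career at a flat wage of 30,000, an 18% contribution
# rate, 1.5% notional growth, and roughly 18 expected retirement years.
wages = [30_000] * 40
capital = notional_capital(wages, contribution_rate=0.18, notional_rate=0.015)
pension = annual_pension(capital, annuity_divisor=18.0)
print(round(capital), round(pension))
```

Because benefits are mechanically linked to contributions and longevity, the system is harder to push into deficit, which is the "financially solid" property noted above, but it also makes the benefit cuts explicit.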
Diamond concluded by saying that “there is no single best pension system for everywhere”. For him, this is the core message: pension design is politically difficult because there are diversities in population, family structure, and labour market experiences, and also because the whole fabric entails economic and demographic risks.
Individual papers: Learning the hard way
A common thread linking most papers was an evaluation of the impact of the Global Crisis on theory and models. “Many things have changed since the Great Recession”, said Pierre Collin-Dufresne (École Polytechnique Fédérale de Lausanne), discussant of the theoretical paper presented by Monika Piazzesi (Stanford University) on “Payments, credit and asset prices”. Among the major changes, the paper focuses on the unprecedented growth in central banks’ balance sheets and in private banks’ holdings of central bank reserves, as well as on persistently low inflation. Piazzesi and her co-author Martin Schneider assess how monetary policy affects asset and goods prices in this new environment.
The housing market and the effects of negative interest rates on bank lending were the two subjects dealt with by Arvind Krishnamurthy (Stanford University), with a theoretical paper on “Mortgage design in an equilibrium model of the housing market”, and Glenn Schepens (ECB), who dealt empirically with “Life below zero: bank lending under negative policy rates”.
The recent US financial crisis featured elevated housing market volatility and widespread household default. How can mortgages be redesigned to avoid amplifying housing market busts? In his presentation, Krishnamurthy focused on mortgage designs that index payments to the aggregate state of the economy. If the central bank lowers interest rates, mortgages that index to short-term interest rates – such as adjustable rate mortgages (ARMs) – provide insurance benefits in a crisis by reducing payments, which smooths consumption, stimulates purchases by new homeowners, reduces default, and short-circuits a price-default spiral.
ARMs do, however, have drawbacks if real rates rise in a downturn. Therefore, the paper finds that mortgage designs that front-load payment reductions to provide maximal relief to constrained homeowners in a crisis perform better than designs that spread the benefit over the life of the mortgage.
In his paper, Glenn Schepens studied empirically the benefits and costs of one particular type of unconventional monetary policy – namely, negative policy rates – for banks and for the real economy. The main finding is that the impact of negative policy rates depends heavily on the bank’s funding structure. Banks that finance themselves with retail deposits reduce overall lending (measured as the amount of syndicated loans they extend to firms) but increase risk-taking (measured by the volatility of the stock market returns of the funded firms).
Moreover, the effect on risk-taking is driven by a switch away from relatively safe borrowers towards previously unfunded riskier borrowers; the latter in turn increase their rates of capital investment. By affecting both bank lending and bank risk-taking, unconventional monetary policy has implications both for financial stability and for real growth.
Empirical work at the conference included a presentation by Luigi Guiso (Einaudi Institute for Economics and Finance) on the first comprehensive analysis of the heterogeneity in returns on investment across individuals. The analysis (“Heterogeneity and persistence in returns to wealth”) was motivated by the observation that wealth inequality has been persistently increasing over time, to a point where the top 0.1% of the population hold around 20% of all wealth in the US, compared with 10% several decades ago.
Economic theory has argued that persistent heterogeneity in the returns on assets can partly explain these developments, in the presence of a positive correlation between wealth and the ability to generate financial returns. Overall, the findings suggest that the ability to generate returns on investment contributes to an individual’s lifetime income on a par with the accumulation of human capital. Thus, the paper has important implications for the policy debate on measuring trends in wealth inequality and the optimal taxation of capital income and wealth.
Another empirical paper, “The elusive costs of inflation: price dispersion during the US Great Inflation”, presented by Emi Nakamura (Columbia University), assesses the costs of elevated inflation in the US in the 1970s due to inefficient price dispersion, using a new dataset of individual prices. The results suggest that the standard New Keynesian analysis of the welfare costs of inflation needs to be reassessed.
The conference had a rich menu of theoretical papers. Pol Antràs (Harvard University) presented “Globalization, inequality, and welfare”. Most models of international trade operate under the assumption that the welfare gains from trade can be redistributed without costs across agents, compensating those adversely affected. The paper relaxes this assumption, by realistically assuming that redistribution is limited and occurs via distortionary income tax-transfer systems, and proposes that welfare gains from trade should be corrected for any increase in inequality. Trade-induced increases in inequality of disposable income erode about 20% of the gains from trade, while the gains from trade would be about 15% larger if redistribution were to be carried out via non-distortionary means.
The Great Recession was a deep downturn with long-lasting effects on credit markets, labour markets, and output. A particularly puzzling aspect has been the slow US recovery, with GDP persistently below its pre-crisis trend, a phenomenon dubbed ‘secular stagnation’. Laura Veldkamp (New York University) presented, “The Tail that Wags the Economy: Beliefs and Persistent Stagnation”, proposing a simple mechanism to account for slow recoveries after large, negative shocks. It rests on the premise that no-one knows the true distribution of shocks to the economy. If economic agents use observed macro data to estimate this distribution non-parametrically, then transitory events – especially extreme events – generate persistent changes in beliefs and thus in macro outcomes. In a model designed to explain the onset of the Great Recession, this new mechanism can endogenously generate secular stagnation.
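The belief mechanism described above can be illustrated with a toy simulation (purely illustrative numbers, not the paper’s calibration): agents estimate the shock distribution non-parametrically from observed history, so a single extreme observation permanently raises their estimated probability of disaster, even after many subsequent normal years.

```python
# A minimal sketch of the belief mechanism in Kozlowski, Veldkamp and
# Venkateswaran (2017): agents estimate the shock distribution
# non-parametrically from observed data, so one tail event has a
# persistent effect on beliefs. All numbers are illustrative.
import random

random.seed(0)

def tail_probability(observations, threshold):
    """Empirical (non-parametric) probability of a shock below threshold."""
    return sum(1 for x in observations if x < threshold) / len(observations)

# 80 years of ordinary shocks: a deep-crisis shock has never been seen,
# so its estimated probability is zero.
history = [random.gauss(0.0, 1.0) for _ in range(80)]
p_before = tail_probability(history, threshold=-5.0)

# One Great-Recession-sized outlier enters the sample...
history.append(-8.0)
p_crisis = tail_probability(history, threshold=-5.0)

# ...and even after 20 more ordinary years, estimated tail risk stays
# elevated: the crisis observation never drops out of the sample.
history += [random.gauss(0.0, 1.0) for _ in range(20)]
p_after = tail_probability(history, threshold=-5.0)

print(p_before, p_crisis, p_after)
```

Because agents never "unsee" the tail event, perceived disaster risk stays above its pre-crisis level, depressing investment and output long after the shock itself is gone: an endogenous route to secular stagnation.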
John Leahy (University of Michigan) presented a paper about information and choices in economic matters, from insurance plans, to mortgages, to more complicated choices. Information is neither free nor automatic, and acquiring it is costly. “Rational inattention and inference from market share data” studies the evolution of market shares when agents freely observe past shares and also engage in costly private information acquisition. The analysis of steady-state behaviour in particular opens the door to analysis of market behaviour and policy, and to issues of inference from suitably rich data.
Panel discussion: The way out
The panel discussion focused on “Exit from non-standard monetary policy” and, in particular, exit from so-called quantitative easing (QE), a theme clearly relevant for the ECB, which has engaged in a large programme of asset purchases. The panel was chaired by Benoît Cœuré (member of the Executive Board of the ECB) who brought the perspective of a central banker to the discussion, and included Olivier Blanchard (Peterson Institute for International Economics), Hyun Song Shin (Bank for International Settlements), and Jeremy Stein (Harvard University).
What will happen as QE is phased out? During the years of low interest rates, and up to now, markets have relied on long-duration and very long-duration safe assets. A shift in preference back towards shorter-term assets is predictable. But how big will it be, and how fast? “It is not going to be a big deal, but I do not want to be overly complacent”, was Stein’s bottom line in the follow-up to his speech. Shin’s final assessment struck a more cautious tone, concluding a lively question-and-answer session by stating that “markets can overreact”. Blanchard, summing up his own assessment, was even more cautious: the market seems to expect very low rates even during the phase-out, he said, but “my view is that there might be a risk of much higher ones”, quite different from widespread expectations.
While Stein tackled several issues related to the liabilities side of the central bank’s balance sheet, Shin discussed the impact of large-scale asset purchases on financial markets and investors’ behaviour. Blanchard wrapped up and came back to the question of how central banks should exit from non-standard measures.
Stein focused on the Federal Reserve System. As the Fed is starting to run QE in reverse, it is natural to ask what happens when you start liquidating all those assets. Stein talked about an important coincidence: the QE era was a monetary policy effort, but it happened to coincide with fairly dramatic changes in financial markets, and in regulation in particular, as a consequence of 2008. The point he made was that some of these regulations (liquidity coverage ratio, etc.) were creating greatly increased demand for various safe assets. The Fed, by massively increasing reserves, was creating a certain kind of safe asset, which greatly benefitted market functioning. Things went smoothly because of this almost inadvertent supply effect that was just the balance sheet mirror image of QE.
“Now as we start to unwind, one ball that you want to keep your eye on is what’s happening there”. The Fed’s balance sheet is $4.5 trillion. On the asset side: $2.5 trillion of Treasuries, $1.8 trillion of mortgage-backed securities, and some bits and pieces. Less attention is given to the liabilities side: $1.6 trillion of currency, $2.3 trillion of reserves, and again some other items such as Treasury deposits and a smaller repo programme. Interestingly, Stein argued, the Fed has been quite precise about the near-term trajectory, but has not called out the endpoint yet. That’s smart. But the balance sheet could shrink by $1.5 to $2 trillion. Nothing remarkable is going to happen, according to Stein, for about a year. Then we should see some movement as rates go up, in particular lively demand for short-term assets.
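Stein’s figures can be checked against the balance-sheet identity. In the sketch below, the residual “other” items on each side are backed out so that both sides sum to the $4.5 trillion total; those residuals are inferred for illustration, not quoted figures.

```python
# Checking Stein's Fed balance sheet figures (USD trillions).
# The residual "other" entries are backed out from the $4.5tn total;
# they are inferred, not quoted.
TOTAL = 4.5

assets = {"Treasuries": 2.5, "MBS": 1.8}
assets["other"] = TOTAL - sum(assets.values())            # roughly 0.2

liabilities = {"currency": 1.6, "reserves": 2.3}
liabilities["other"] = TOTAL - sum(liabilities.values())  # roughly 0.6

# A $1.5-2tn unwind would leave a balance sheet of roughly $2.5-3tn.
after_unwind = (TOTAL - 2.0, TOTAL - 1.5)
print(round(assets["other"], 1), round(liabilities["other"], 1), after_unwind)
```

The arithmetic makes Stein’s point concrete: even after the mooted unwind, the balance sheet would remain far larger than the $1.6 trillion of currency alone, leaving a substantial stock of reserves outstanding.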
“So again my best guess is that nothing too exciting will happen for the next year or so but then after that I think it’s at least an open question, and it would be good to keep a little bit of optionality available.”
Shin explained why long rates seem to be so resilient in the face of monetary policy normalisation. Long rates in pretty much all jurisdictions are where they were this time last year. They fell sharply in the middle of the summer of 2016 after the Brexit referendum, when long rates in some jurisdictions went negative. We are now one year on from there, and yet long rates haven’t really budged that much. One line of argument is that markets are far-sighted. “We tend to take market prices as signals and so if prices are not reflecting monetary normalisation well, do the markets know something that we don’t?” But, according to Shin, assigning too much wisdom to the markets may be a risky bet.
Shin argued that it is instructive to look at investment behaviour in the core Eurosystem countries, and in particular the portfolios built by German insurers since 2008. Compared with just before the crisis, holdings by German insurers of ultra-long sovereign bonds have more than quadrupled, and this has coincided with a decline in the long-term interest rate. When interest rates rise, it is possible that the opposite behaviour – that is, growing demand for short-term bonds – might get the upper hand. The market does not seem to have factored in this development up to now, but nobody can rule it out. It will all depend on how smooth the transition is when it happens, was Shin’s conclusion.
Blanchard focused on the issue of what the ultimate size of the central bank balance sheet should be. “Jeremy said that the Fed is smart not to tell exactly where it’s going to go, and I could see the point, which is if you don’t exactly know how things are going to work out you don’t want to commit in advance.” But it seems a bit strange to start going in some direction, observed Blanchard, without indicating where you are going to end up.
One important question is what the optimal size of a central bank balance sheet should be – how much smaller? According to Blanchard, in the scenario of a very gradual exit from QE the Treasury has to be brought in because, fundamentally, balance sheet operations of the central bank are government debt management, and, in the end, what the public holds is determined by the decisions of the central bank and the Treasury together. If demand for short-term bonds grows, reversing the trend since 2008, the Treasury should mostly see to it. So coordination is needed between the central bank and the Treasury. This is much easier in the US, but more complicated in the euro area, “where we have one central bank and 19 treasuries which do not even now coordinate”.
Blanchard then sketched five different scenarios of what might happen and suggested that only the one indicated by Stein – that is, strong demand for short-term bonds going together with an increase in central bank rates – seems credible. This does not guarantee, of course, that the transition will be smooth and that bond rates will not rise sharply. And since the treasuries will be called on to ease the transition, the fact that the fiscal implications are bigger in the euro area should not be underestimated.
Editors’ note: The full conference programme, and links to all of the papers presented can be found here.
Antràs, P, A de Gortari and O Itskhoki (2017), “Globalization, Inequality and Welfare”.
Caplin, A, J Leahy and F Matejka (2017), “Rational Inattention and Inference from Market Share Data”.
Fagereng, A, L Guiso, D Malacrino and L Pistaferri (2017), “Heterogeneity and Persistence in Returns to Wealth”.
Guren, A M, A Krishnamurthy, and T J McQuade (2017), “Mortgage Design in an Equilibrium Model of the Housing Market”.
Heider, F, F Saidi, and G Schepens (2017), “Life Below Zero: Bank Lending Under Negative Policy Rates”.
Kozlowski, J, L Veldkamp and V Venkateswaran (2017), “The Tail that Wags the Economy: Beliefs and Persistent Stagnation”.
Nakamura, E and J Steinsson (2017), “The Elusive Costs of Inflation: Price Dispersion during the U.S. Great Inflation”.
Piazzesi, M and M Schneider (2017), “Payments, Credit and Asset Prices”.