Modern macro models are designed to assess regimes or rules that commit governments to a complete specification of policy choices as a function of their available information. But – perhaps fortunately – few policymakers are ever faced with making these kinds of sweeping changes. They are instead charged with making more episodic macroeconomic policy decisions (like how much to raise the Fed Funds rate or cut taxes). As a result, modern (post-1980) academic macroeconomics has not been of much practical use to economic policymakers. I discuss how to use insights from modern (post-1980) microeconomics – in both its theoretical and applied forms – to develop a scientific framework capable of addressing these kinds of policy decisions.
The scope of modern macroeconomics
Roughly 40 years ago, in the wake of the Lucas Critique (1976), a microfoundations revolution began in the discipline of macroeconomics. Academic analyses of macroeconomic policy came to be based on models with forward-looking utility-maximising consumers/workers and profit-maximising firms that interacted over time in (largely) competitive markets. As I write in 2018, thousands of papers have been written that follow this basic template. Thanks to the huge increases in computing power and micro-data accessibility over the past 20 years, current papers now use models that are remarkably complex in terms of their ingredients and in terms of their implications.
Yet, despite its dazzling intellectual accomplishments, this paradigm does not play much of a role in the policy world. This is readily seen by looking at the core macroeconomic models of the Congressional Budget Office (Arnold 2018: 5) or the Federal Reserve (Brayton et al. 2014). Larry Summers, who served as a key economic advisor during the first couple of years of the Obama administration, summarised his own experience by saying in 2011 that, “the vast edifice in both its new Keynesian variety and its new classical variety of attempting to place microfoundations under macroeconomics was not something that informed the policy making process in any important way” (Summers 2011).
But this disconnect between the academic paradigm and the needs of policymakers shouldn’t be surprising. Robert Lucas laid the intellectual foundations for the modern approach in a series of papers in the late 1970s. In doing so, he argued that the impact of any particular one-off policy decision depended on how it affected the private sector’s beliefs about future policy choices. As a result, the government had to manage these expectations about the future, and, in Lucas’ words, “this limits the class of policies the consequences of which we can hope to assess in advance to policies generated by fixed, well understood, relatively permanent rules (or functions relating policy actions taken to the state of the economy)” (Lucas 1980).
The considerations stressed by Lucas have led modern academic macroeconomists with policy interests to focus on the comparisons of regimes or rules. These quasi-constitutions provide a complete description of how the government will make monetary and fiscal policy choices in all future dates and states. They leave no room for discretionary choices and therefore are not designed to help policymakers confront the more practical issue of “what is to be done today?”
This situation raises a natural question. How can modern macroeconomics be re-oriented so that it can be of more practical value to modern policymakers? In a recent paper, I suggest that the answer to this question lies in modern microeconomics (Kocherlakota 2018).
I begin by using modern theoretical microeconomics to frame the policy question.
Consider a policymaker who is making but one in a sequence of choices over time (such as whether or not to raise interest rates or cut taxes). With each decision, the policymaker seeks to maximise an objective function based on macroeconomic variables (like unemployment and inflation), and so is concerned about the impact of his/her policy choice on these variables. That macroeconomic impact is in turn shaped by how a forward-looking private sector responds to the policy action.
This description is that of a dynamic game between the policymaker and the private sector. Over the past 40 years, economic theorists have developed powerful tools to analyse these games. A core prediction of the resulting theory is that, even though it has complete discretion at each date, the government finds it optimal to use a fixed strategy that maps its available information into an action. Thus, despite its discretion, the government ends up acting as if it is following a rule. The private sector can then base its predictions about the government’s future policy choices on its knowledge of this rule.
Admittedly, there is an apparent conundrum embedded here – how can the government ever consider deviating from a rule without shaking the private sector’s confidence in that rule? The answer to this apparent paradox is that in most situations, the private sector knows that the government’s actions are in part based on information (about its objective or the economy) that only the government sees. In those situations, the private sector attributes apparent deviations to realisations of that information, not to the policymaker’s changing its rule.
We can state this as a principle.
The safe-to-deviate principle: Suppose that the policymaker’s strategy is such that the private sector can attribute any apparent deviation as being due to a (possibly highly unlikely) realisation of the policymaker’s private information. Then, the policymaker can safely make any choice without affecting the private sector’s beliefs about future policy decisions.
Practical advice for policymakers
With this basic game-theoretic framework in place, we can now use some of the thinking in modern applied microeconomics to develop a systematic approach to making one in a long sequence of policy choices. More specifically, suppose that the policymaker has past data on:
- pre-determined (at the time of policy choice) economic factors (that from now on I’ll simply label X);
- policy choices;
- economic outcomes of those policy choices, X, and other unobserved shocks.
Suppose too that she believes that her past strategy satisfies the ‘safe-to-deviate’ principle. How should she best use her available data to make a decision today?
The first point – which is both obvious and usually disregarded – is that the policymaker needs only to predict the impact of her policy choice on one variable, namely, the value of her objective. There is no point in using up (scarce) data trying to predict variables that don’t enter into the objective, or in predicting the separate behaviours of the variables that do enter the objective.
With that point in mind, the policymaker wants to use the past data to figure out which policy choice, given the currently observed X, will lead to the highest value for her objective. The safe-to-deviate principle provides confidence that the private sector’s response function is a stable one over time. But the policymaker still needs to be able to translate the observed statistical relationship between policy choices and objective values into a causal relationship. The applied microeconomics literature on causation (e.g. Imbens and Wooldridge 2007) tells us that she can do so if, conditional on X, the policy choice varies over time because of other factors that:
- have no direct influence on the relevant economic outcomes;
- are statistically independent of the other shocks that affect the relevant economic outcomes.
(Note that, if X is publicly known, we can associate this auxiliary variation in past policy choices with the policymaker’s private information that was discussed earlier.)
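These two identification conditions can be illustrated with a small simulation. Everything below is hypothetical and constructed purely to show the logic: when the auxiliary variation in policy is independent of the other shocks, a regression of outcomes on policy choices recovers the causal effect; when the policy also responds to those shocks, the regression is biased.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Hypothetical setup: outcome = a + u, so the true causal effect of the
# policy choice a on the outcome is exactly 1; u is an unobserved shock.
u = rng.normal(0, 1, n)

# Case 1: policy varies only through private information e, independent of u.
e = rng.normal(0, 1, n)
a_clean = e
slope_clean = np.polyfit(a_clean, a_clean + u, 1)[0]

# Case 2: policy also responds to the shock u (the independence condition fails).
a_confounded = e + u
slope_confounded = np.polyfit(a_confounded, a_confounded + u, 1)[0]

print(round(slope_clean, 2))       # close to 1.0: the causal effect is recovered
print(round(slope_confounded, 2))  # close to 1.5: the estimate is biased upward
```

In the confounded case the population regression slope is cov(a+u, a)/var(a) = 3/2, so the bias is not a small-sample artefact – it persists no matter how much data the policymaker accumulates.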
If these conditions are satisfied, she can find the best possible current decision by:
- running a (possibly nonlinear) regression of objective values on past policy choices and X’s;
- making the policy choice that maximises that regression function, given the current X.
This regression function (of objective values on policy choices and X’s) is, in the language of modern public finance, a sufficient statistic (Chetty 2009) for the policymaker’s decision.
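As a sketch of how the two steps might look in practice, the following simulation fits a nonparametric (kernel) regression of objective values on past policy choices and X, then maximises the fitted function over a grid of candidate policies given today’s X. The data-generating process, sample size, bandwidth, and grid are all hypothetical choices made for illustration, not part of the original argument.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical past data: predetermined factor X, policy choice a, and
# realised objective value v. By construction, the best policy is 0.8 * X.
n = 2000
X = rng.uniform(-1, 1, n)
a = 0.5 * X + rng.normal(0, 0.5, n)   # policy varies beyond X (private information)
v = -(a - 0.8 * X) ** 2 + rng.normal(0, 0.1, n)

def regression(a0, x0, bandwidth=0.2):
    """Step 1: Nadaraya-Watson kernel estimate of E[v | a = a0, X = x0]."""
    w = np.exp(-((a - a0) ** 2 + (X - x0) ** 2) / (2 * bandwidth ** 2))
    return np.sum(w * v) / np.sum(w)

def best_policy(x_today, grid=np.linspace(-1.5, 1.5, 61)):
    """Step 2: pick the policy that maximises the estimated regression function."""
    return max(grid, key=lambda a0: regression(a0, x_today))

print(best_policy(0.5))  # should be near the true optimum 0.8 * 0.5 = 0.4
```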
This approach is wholly nonparametric and so requires a great deal of data. If data are scarce, the policymaker may find it useful to impose credible restrictions on the functional form of the regression function. Economic theory – whether micro-founded or not – can be a valuable source of information about these restrictions.
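For instance, if theory suggests the objective is approximately quadratic in the policy choice and X, the regression collapses to ordinary least squares on a handful of terms, and the maximising policy has a closed form. A hypothetical sketch, with the same (made-up) data-generating process as above but deliberately scarce data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Deliberately scarce hypothetical data: only 40 past decisions.
n = 40
X = rng.uniform(-1, 1, n)
a = rng.normal(0, 1, n)
v = -(a - 0.8 * X) ** 2 + rng.normal(0, 0.1, n)  # true optimum: a = 0.8 * X

# Restriction: v is quadratic in (a, X). Fit the coefficients by least squares.
Z = np.column_stack([np.ones(n), a, a ** 2, X, a * X, X ** 2])
b, *_ = np.linalg.lstsq(Z, v, rcond=None)

def best_policy(x_today):
    """Closed-form maximiser of b1*a + b2*a^2 + b4*a*x (concave when b2 < 0)."""
    return -(b[1] + b[4] * x_today) / (2 * b[2])

print(best_policy(0.5))  # should be near the true optimum 0.4
```

The parametric restriction buys precision with 40 observations that the fully nonparametric approach would need thousands of observations to match.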
Where do we go from here?
Academic macroeconomists continue to enrich their theoretical models and the data used to inform those richer models. There is, after all, no logical end to the demands of the Lucas Critique – no matter how rich a model is, there is always a policy change of interest that would require an even richer one.
But this process seems unlikely to make academic macroeconomics of more practical use. Instead, what is needed is to re-focus research and (even more importantly) instruction on empirically oriented questions like:
- Why have policy decisions varied in the past? Can this source of variation be plausibly viewed as having had little impact on the economy, except through the policy choice itself?
- What evidence supports (or doesn’t support) the ‘safe-to-deviate’ principle for a possible choice?
- What kinds of theoretically plausible a priori functional form restrictions can be imposed on the form of the regression function of interest?
These kinds of questions lie at the heart of much modern applied microeconomics. They should become more central to macroeconomic policy evaluation that is intended to be of practical value.
Arnold, R (2018), “How CBO Produces Its 10-Year Economic Forecast”, Congressional Budget Office Working Paper.
Brayton, F, T Laubach, and D Reifschneider (2014), “The FRB/US Model: A Tool for Macroeconomic Policy Analysis”, Federal Reserve Note.
Chetty, R (2009), “Sufficient Statistics for Welfare Analysis: A Bridge Between Structural and Reduced-Form Methods”, Harvard University working paper.
Imbens, G, and J Wooldridge (2007), “Instrumental Variables with Treatment Effect Heterogeneity: Local Average Treatment Effects”, NBER Lecture 5.
Kocherlakota, N (2018), “Practical Policy Evaluation,” NBER Working Paper 24643.
Lucas, R E, Jr (1976), “Econometric Policy Evaluation: A Critique”, Carnegie-Rochester Conference Series on Public Policy 1, 19-46.
Lucas, R E, Jr (1980), “Rules, Discretion, and the Role of the Economic Advisor”, in S Fischer (ed.), Rational Expectations and Economic Policy, Chicago: The University of Chicago Press, 199-210.
Summers, L (2011), “A Conversation on New Economic Thinking”, interview with Martin Wolf at the Bretton Woods conference.