Financial Crisis Symposium

Implications for models in monetary policy

Pages 429-444 | Published online: 10 Dec 2010
 

Abstract

Monetary authorities have been implicated in the financial crisis of 2007–2008. John Muellbauer, for example, has blamed what he saw as the initially inadequate policy responses of central banks to the crisis on their models, which are, in his words, ‘overdue for the scrap heap’. This paper investigates the role of monetary policy models in the crisis and finds (i) that monetary policy likely contributed to the financial crisis; and (ii) that an inappropriately narrow suite of models made this mistake easier. The core models currently used at prominent central banks were not designed to detect emergent financial fragility. In that respect John Muellbauer is right. But the implications drawn here are less dramatic than his: while the representative agent approach to micro-foundations now seems indefensible, other aspects of modern macroeconomics are not similarly suspect. The case made here is rather for expanding the suite of models used in the regular deliberations of monetary authorities with new models that give explicit roles to the financial sector, to money and to the process of exchange. Recommending a suite of models for policy making entails no methodological innovation: that is what central banks do, though, of course, how they do it is open to improvement. The methodological innovation is the inclusion of a model sensitive to financial fragility, a sensitivity that was absent in the run-up to the current financial crisis.

JEL Codes:

Notes

 1. There are a number of excellent accounts of the crisis, such as Blanchard (2009), Almunia et al. (2009) and, more recently, Roberts (2010), which focuses on changes in financial sector incentives. The text focuses on the aspects most closely related to monetary policy and, consequently, misses much of the detail (especially of an institutional kind) in these broader accounts.

 2. This was one of the less widely shared notions in the modern consensus on monetary policy. Or, to be more precise, it was one of the elements of the consensus that many monetary economists and central bankers felt was most open to further research and experience (Goodfriend 2005; Mishkin 2007).

 3. Note that Bernanke and Gertler (1999) did not recommend policy paralysis; on the contrary, they reasoned that ‘asset price crashes have done sustained damage to the economy only in cases when monetary policy remained unresponsive or actively reinforced deflationary pressures’ (Bernanke and Gertler 1999, p. 18). And it seemed that the required response was precisely the response indicated by a flexible inflation targeting system.

 4. While ‘monetary policy’ typically means the interest rate policy of a modern central bank, there are other instruments available to central banks, including reserve requirements, quantitative adjustments to the monetary base and usually a role in the regulation of financial sector firms (Borio and Disyatat 2009).

 5. Blinder (2005) argued that the ‘mop up’ approach passed a ‘severe’ stress test with the unwinding of the dot-com bubble from 2000 to 2002.

 6. For Mishkin (2008) a prerequisite for taking policy action against even a credit bubble is that the bubble reflects market failure of some kind.

 7. William Poole – then chief executive of the Federal Reserve Bank of St Louis – observed the same in 2007 (Poole 2007).

 8. Awareness of the data constraints under which policy is made in real time is hardly new; the point has been made repeatedly in the post-war period by, to name just a few, Friedman (1947), Meltzer (1987) and McCallum (1994), and more recently by Orphanides in a series of papers, including Orphanides (2002).

Two important lessons have emerged from this literature (Orphanides 2002, p. 606). First, the evaluation of past policy must be sensitive to the data available in real time to policy makers. For the case at hand this means looking at the Fed's best estimates of output and inflation in the critical years between 2001 and 2005, not an evaluation of what seems optimal with the benefit of the revised data. Second, realistic policy alternatives have to be based on data that are both available and measurable with a high signal-to-noise ratio. From this perspective it is not useful to examine the merit of alternative policy paths in 2001 to 2005 on the assumption that policy makers could have known that an asset bubble was emerging in the US housing market.

The quantitative importance of noise in the data for the merit of alternative policy paths was demonstrated through simulation by Orphanides (2002). He contrasts the substantial gains that appear within the grasp of an activist monetary authority operating with noiseless data with the substantially inferior results generated by the same activism when the relevant data are noisy. Indeed, with noise, a much less activist policy rule delivers substantially better stabilisation for the economy. The same result, favouring a ‘conservative’ central banker, emerges when the monetary authorities grapple not only with noisy data but also with uncertainty about the transmission mechanism and expectations (Orphanides and Williams 2008).
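The logic of this result can be illustrated with a minimal sketch. The model below is not Orphanides's own; it is a stylised AR(1) output gap with an illustrative policy rule, and all parameter values (`rho`, `lam`, the noise and shock standard deviations) are assumptions chosen only to make the mechanism visible: an activist rule that reacts strongly to a perfectly measured gap stabilises best, but the same rule reacting to a noisily measured gap injects the measurement error back into the economy, so a more cautious rule wins.

```python
import random
import statistics

def gap_variance(theta, noise_sd, rho=0.8, lam=0.5,
                 shock_sd=1.0, periods=20000, seed=42):
    """Variance of a stylised output gap when policy reacts with
    strength `theta` to a gap measured with error `noise_sd`.
    Law of motion: g[t+1] = rho*g[t] - lam*r[t] + shock,
    where the rule sets r[t] = theta * (g[t] + measurement error)."""
    rng = random.Random(seed)
    gap, history = 0.0, []
    for _ in range(periods):
        observed = gap + rng.gauss(0.0, noise_sd)   # real-time estimate
        rate = theta * observed                     # policy response
        gap = rho * gap - lam * rate + rng.gauss(0.0, shock_sd)
        history.append(gap)
    return statistics.pvariance(history)

activist, conservative = 1.5, 0.5
# With noiseless data, strong activism stabilises best...
clean_act = gap_variance(activist, noise_sd=0.0)
clean_con = gap_variance(conservative, noise_sd=0.0)
# ...but with noisy real-time data the ranking reverses.
noisy_act = gap_variance(activist, noise_sd=2.0)
noisy_con = gap_variance(conservative, noise_sd=2.0)
print(clean_act < clean_con, noisy_con < noisy_act)
```

The reversal follows directly from the law of motion: the rule feeds `lam * theta` times the measurement error into next period's gap, a cost that grows with the square of the response coefficient, so the optimal degree of activism falls as the data get noisier.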

 9. These ‘nowcasts’ are released with a five-year time lag, allowing consideration of the critical 2002 to 2005 period at the time of writing.

10. John Muellbauer (2007) argued that the Fed's concern with the risk of deflation was a misplaced fear that the Japanese episode of the 1990s could be repeated in the USA.

11. Many other factors played a role in the increased leverage and the perverse incentives that encouraged risk taking in the banking sector, including regulatory mistakes, or even ‘holes’ in financial regulation, as well as psychological factors such as overoptimism (Blanchard 2009; Roberts 2010).

12. Models used by central banks in the policy process will usually disaggregate this IS relationship into a number of constituent relationships for different types of real expenditure (Goodhart 2005, pp. 3–4).

13. As Sims (2002) showed, the policy process at leading central banks uses model results combined with non-model information about the economy, from surveys and other data, as well as subjective judgments.

14. While they specify a continuum of households, their aggregation is trivial owing to the assumptions of separable utility and complete contingent markets (Fernández-Villaverde et al. 2010, p. 6).

15. That is to say, without ‘sound justification’ (Fernández-Villaverde et al. 2010, p. 5).

16. Identifying the key institutions (those where stress and failure will have consequences beyond the ‘limited group of customers and counterparties’ (Crockett 1997, p. 10)) is difficult in practice. The recent crisis suggests that even well-informed central banks can err in this judgement. Nevertheless, reforms under consideration for financial regulation internationally emphasise the position of ‘systemically important institutions’.

17. Important papers in this literature include Tsomocos (2003a, 2003b) and Goodhart et al. (2004, 2006).

18. This discussion draws on Bårdsen et al. (2006, pp. 9–13), but the criteria envisioned here for an adequate model differ notably from their list.

19. Heterogeneity in the banking sector is critical for a model of financial fragility: in a representative agent model either all banks (the bank) fail or none does, while the feature of financial fragility of interest here is the failure of some banks while others survive, perhaps in a more fragile condition. Furthermore, there is no possibility of an inter-bank market (the source of much contagious interaction between banks, and prominent in this regard in the present crisis) without heterogeneous banks (Goodhart et al. 2006).

Some DSGE models have incorporated financial frictions by way of a financial accelerator, in which a representative agent model with asymmetric information generates balance sheet effects for the firm through one-period optimal stochastic debt contracts with costly verification. Though this is a successful strategy for including balance sheet effects, it creates no role for banks in the model, nor any room for regulatory policies, in a framework where the equilibrium is always constrained efficient (Goodhart et al. 2010, p. 2).

20. Goodhart et al. (e.g. 2004, 2006) build on earlier work in this direction by Dubey, Geanakoplos, and Shubik (2000) and Shubik and Wilson (1977).

21. The RAMSI model under development at the Bank of England (discussed below) is an example of the microeconomic foundations envisioned here.

22. Hoover (2010) identified three distinct programmes to develop micro-foundations for macroeconomic models, which he called (i) the aggregation programme; (ii) the general-equilibrium programme; and (iii) the representative-agent programme. The micro-foundations described above are closer to the heterogeneity that Hoover identified in Keynes and to the aggregation programme of Klein.

By contrast, the micro-foundations used almost universally in modern macroeconomics are of the representative agent kind, on the shortcomings of which see Hoover (2008, 2009, 2010). The tremendous problems of aggregation, and of working with the utility function of a representative agent as if its connection to the utility functions of individuals were not deeply problematic (and perhaps insurmountable), have never been resolved (Hoover 2010). Under these circumstances the widespread adoption of representative agent models has the character of an ideological response (Hoover 2009).

23. There are many reasons for this, including the belief of policy committee members that the additional information gives them a more accurate picture of the current state of the economy and hence a superior basis for their short-term forecasts of key variables; in Sims's words, ‘several of those involved in subjective forecasting, at more than one central bank, expressed the view that the advantage of subjective forecasts is almost entirely in getting the current and the next quarter right’ (Sims 2006, p. 21). The desire to respond to large and unusual events is another reason for the persistent use of non-model information. A third reason is the belief that no single model captures the range of issues relevant to an MPC (Vickers 1999).
