
Is neo-Walrasian macroeconom(etr)ics a dead end? An assessment of recent criticisms of DSGE models

Pages 441-469 | Published online: 21 Sep 2017
 

ABSTRACT

After the 2008 “new Great Crisis,” it is widely recognized that mainstream macroeconom(etr)ics, the culmination of Lucas’s anti-Keynesian revolution of the 1980s, which sought to give macroeconomics sound neo-Walrasian microeconomic foundations, failed first to anticipate and then to explain the crisis. Has this crisis revealed a failure of this macroeconom(etr)ics as a scientific theory? Mainstream macroeconomists defend their models on the basis of their alleged superiority in terms of clarity and coherence. The thesis of this article is that this claim of superiority is false. The study argues that the reasons for the failure of mainstream macroeconom(etr)ics, in particular its poor predictive performance and its interpretative weakness, lie in the implications of the neo-Walrasian legacy and in the problems connected with the implementation of that program.


Notes

1We refer, first of all, to Olivier Blanchard (Citation2009), who wrote that “the state of macro is good,” and to Michael Woodford (Citation2009, p. 268), who maintained that “there has been a considerable convergence of opinion among macroeconomists over the past ten or fifteen years.” More generally, at the beginning of the new millennium a general consensus emerged that the fundamental mechanisms of macroeconomics were understood. See the optimistic declaration of Robert Lucas (Citation2003) in his presidential address to the American Economic Association, and Ben Bernanke (2004), who celebrated the great moderation in economic performance over the previous two decades, attributing it in large part to improved economic policymaking resulting from a better understanding of how the economy works.

2On November 18, 2010, at the annual Central Banking Conference, Jean-Claude Trichet, then president of the European Central Bank, expressing what was considered a largely accepted opinion not only among heterodox economists, said that “macromodels failed to predict the crisis and seemed incapable of explaining what was happening in the economy in a convincing manner” (quoted again in Trichet, Citation2013, p. 244). Specifically, he was referring to Caballero (Citation2010). Somewhat paradoxically, it was Lucas (2009) who first recognized that the neoclassical model was unable to predict the crisis. In fact, since 2009 a great number of papers and books have addressed the limited interpretative and predictive ability of the contemporary neoclassical model: for example, Leijonhufvud (Citation2009), Buiter (Citation2009), Krugman (Citation2009), Caballero (Citation2010), Taylor (Citation2010), and Stiglitz (Citation2011), not to mention all those books that accompanied Keynes’s resurgence, in particular Clarke (Citation2009), Skidelsky (Citation2009), Davidson (Citation2009), and Backhouse and Bateman (Citation2011). Rogers (Citation2013, Citation2014) provides good examples of further in-depth analysis of the shortcomings of contemporary neoclassical macroeconomics.

3Agents are representative households, which maximize their utility under an intertemporal budget constraint, and firms, which maximize profits over time. The economy is affected by different types of exogenous shocks. The framework is designed to capture a plausible business-cycle dynamic of an economy (i.e., following an exogenous disturbance, the economy returns to the deterministic steady state rather rapidly, avoiding cumulative causation processes): the cycle can thus be understood as an efficient response to those shocks.
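A minimal numerical sketch of this mean-reverting dynamic (our illustration, not the authors’; the persistence and volatility values are assumed purely for the example), in the spirit of the first-order laws of motion that linearized DSGE models deliver:

import numpy as np

# Hypothetical law of motion for x_t, the log-deviation of output from
# its deterministic steady state: x_t = rho * x_{t-1} + shock_t.
rng = np.random.default_rng(0)
rho, sigma, T = 0.9, 0.01, 200  # assumed persistence, shock volatility, horizon
x = np.zeros(T)
for t in range(1, T):
    x[t] = rho * x[t - 1] + sigma * rng.standard_normal()
# With |rho| < 1 the effect of any disturbance decays geometrically, so the
# economy drifts back to the steady state instead of entering a cumulative,
# self-reinforcing process.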

4Reduced forms are obtained by solving the system of structural equations for the endogenous variables. Let Y be the vector of endogenous variables to be explained by a statistical model, X the vector of explanatory exogenous variables, and e a vector of error terms; then the structural form is f(Y, X, e) = 0 and the reduced form is Y = g(X, e), where f and g are functions.
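In the linear case, a standard textbook rendering (ours, not the article’s) with coefficient matrices B and Γ makes the mapping explicit:

\[
B Y_t + \Gamma X_t = e_t \quad\Longrightarrow\quad Y_t = -B^{-1}\Gamma X_t + B^{-1}e_t \equiv \Pi X_t + v_t ,
\]

where the reduced-form coefficients \(\Pi = -B^{-1}\Gamma\) and errors \(v_t = B^{-1}e_t\) mix the structural parameters (assuming \(B\) is invertible).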

5Lucas (Citation1980) criticized the neoclassical synthesis because it rested on an old-style interpretation of Walrasian theory, where equilibrium was conceived as a static notion, acting as a center of gravity for disequilibrium states. He maintained that “the idea that an economic system in equilibrium is in any sense at rest is simply an anachronism” (p. 708).

6As is well known, the assumption asserts that individuals use their information efficiently, without systematic errors, in forming their expectations. It does not deny that individuals can make forecasting errors, but it implies that errors will not occur persistently; that is, deviations from the pattern of correct forecasting are random. Indeed, if errors were systematically biased, they would be incorporated into the agents’ expectations. However, Muth himself argued that rational expectations would fail if people’s “errors” were correlated. People may thus hold different expectations, but their differences should cancel out; if they do not (think of herd behavior), the rational expectations hypothesis is in trouble (we are indebted to Alan Kirman for this latter valuable point).
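Formally (a standard statement of the hypothesis, added here for concreteness), for any variable \(x\) and information set \(I_t\):

\[
x_{t+1} = E\left[x_{t+1} \mid I_t\right] + \eta_{t+1}, \qquad E\left[\eta_{t+1} \mid I_t\right] = 0 ,
\]

so forecast errors are mean zero and uncorrelated with anything agents know at time \(t\); errors that are correlated across agents or over time violate the hypothesis.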

7This makes it possible to use the maximization postulate to analyze a world that is continually buffeted by shocks.

8This means that the theory does not live exclusively in a hypothetical world like Debreu’s world of general economic equilibrium. Macroeconomic models must reach practical conclusions.

9The order of integration I(d) of a time series is the minimum number of differences required to obtain a covariance-stationary series. A collection of I(1) time series is said to be cointegrated if some linear combination of them is I(0).
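A minimal simulated example (ours; the data-generating process is assumed purely for illustration):

import numpy as np

rng = np.random.default_rng(1)
T = 500
w = np.cumsum(rng.standard_normal(T))  # common stochastic trend: a random walk, I(1)
y1 = w + rng.standard_normal(T)        # two observed series sharing the trend
y2 = 0.5 * w + rng.standard_normal(T)
# Each series is I(1): differencing once (np.diff(y1)) yields an I(0) series.
# The combination y1 - 2*y2 cancels the common trend w, leaving pure noise,
# so it is I(0) without differencing: y1 and y2 are cointegrated with
# cointegrating vector (1, -2).
z = y1 - 2 * y2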

10In fact, when economic decision makers interact with each other, the outcome for the overall economy may be different from what was intended by the individual decision makers: a classic example is the paradox of thrift as formulated by Mandeville and then Keynes (see also Caballero, Citation1992; Kirman, Citation1992).
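In the simplest Keynesian rendering (a textbook formulation we add for concreteness), with saving \(S = sY\), exogenous investment \(I\), and goods-market equilibrium \(Y = (1 - s)Y + I\):

\[
Y = \frac{I}{s}, \qquad S = sY = I ,
\]

so a rise in the saving rate \(s\) lowers equilibrium income while realized aggregate saving is unchanged: what each individual intends (to save more) is not what the economy as a whole achieves.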

11Cross-equation restrictions have three main implications (Piazzesi, Citation2007). First, they constrain the rational expectation parameters to be consistent with the parameters from the equilibrium probability distribution, removing free parameters describing prerational expectations. Second, they tie together processes describing different endogenous variables that involve the same parameters and shocks, thereby increasing estimation efficiency for different data series containing information about the same parameters. Finally, rational expectations imply that the data-generating process underlying the agents’ beliefs is the same as the true data-generating process, hence justifying generalized method of moments (GMM) estimation on moments derived from the Euler equations.
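For the third point, the canonical case (a Hansen–Singleton-style illustration, ours, assuming CRRA utility) is the consumption Euler equation

\[
E_t\left[\beta \left(\frac{C_{t+1}}{C_t}\right)^{-\gamma} R_{t+1}\right] = 1 ,
\]

which under rational expectations yields moment conditions \(E\left[\left(\beta (C_{t+1}/C_t)^{-\gamma} R_{t+1} - 1\right) z_t\right] = 0\) for any instrument \(z_t\) in the agents’ information set; these moments can be taken directly to the data by GMM.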

12Identification problems arise whenever the data do not allow the best estimate of one or more model parameters to be pinned down uniquely. In particular, observational equivalence occurs when more than one parameter set generates the same observed distributions.
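A minimal illustration (ours): in the process

\[
y_t = (\alpha + \beta)\, y_{t-1} + \varepsilon_t ,
\]

only the sum \(\alpha + \beta\) is identified; every pair \((\alpha, \beta)\) with the same sum generates exactly the same distribution of the data, so distinct structural parameter sets are observationally equivalent.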

13This is due to the ill-behaved mapping between the structural parameters and the coefficients of the solution, which means that the model’s transition laws are relatively insensitive to changes in many parameter values. See Canova and Sala (Citation2009) for an extensive analysis in the standard setting of a unique solution to the representative-agent problem under rational expectations.
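A short numerical check on the toy process above (our sketch; the parameter values are assumed) makes the resulting flat likelihood ridge visible:

import numpy as np

rng = np.random.default_rng(2)
T = 300
alpha_true, beta_true = 0.2, 0.25  # hypothetical values; only the sum matters
y = np.zeros(T)
for t in range(1, T):
    y[t] = (alpha_true + beta_true) * y[t - 1] + rng.standard_normal()

def loglik(alpha, beta):
    # Gaussian log-likelihood (up to constants, unit error variance) of
    # y_t = (alpha + beta) * y_{t-1} + e_t: it depends on (alpha, beta)
    # only through the sum alpha + beta.
    resid = y[1:] - (alpha + beta) * y[:-1]
    return -0.5 * np.sum(resid ** 2)

# Every pair with sum 0.45 returns the identical value: the likelihood is
# flat along that ridge, so no estimator can separate alpha from beta.
print(loglik(0.2, 0.25), loglik(0.25, 0.2), loglik(0.45, 0.0))
print(loglik(0.3, 0.3))  # a different sum does change the likelihood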

14The posterior distribution is proportional to the likelihood times the prior. Hence, in the case of variation-free parameters, the likelihood conveys information about the parameters whenever the prior and the posterior differ (Poirier, Citation1998). Conversely, in the case of parameter constraints, these shift the posterior away from the prior, hence differences between the prior and the posterior do not guarantee parameter identification.
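In symbols (the standard Bayesian identity, restated for clarity):

\[
p(\theta \mid y) \propto p(y \mid \theta)\, p(\theta) .
\]

If the likelihood is flat in some element of \(\theta\) and the parameters are a priori independent, the marginal posterior of that element simply reproduces its prior; when parameters are tied together by constraints, however, an identified parameter can pull the posterior of an unidentified one away from its prior, which is why prior-posterior divergence alone does not establish identification.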

15Hall (Citation1978) shows that, under rational expectations, changes in consumption are unanticipated and therefore unpredictable one step ahead. Hence, structural breaks cause serious forecast errors.
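In Hall’s formulation (the familiar random-walk result, restated here; it holds under quadratic utility with \(\beta R = 1\)):

\[
E_t\left[c_{t+1}\right] = c_t , \qquad c_{t+1} = c_t + \varepsilon_{t+1}, \quad E_t\left[\varepsilon_{t+1}\right] = 0 ,
\]

so no variable known at time \(t\) improves the one-step-ahead forecast, and an unforeseen structural break shows up entirely as forecast error.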

16This theoretical attitude originated in the reaction to Marshallian economics in the 1920s and 1930s (Marchionatti, Citation2003) and greatly influenced postwar mainstream economics, giving rise to what Weintraub (Citation1985) called the “neo-Walrasian research programme” (see also Backhouse, Citation1995). But its strong influence in macroeconomics dates from the 1970s.

17The concept of rigor we refer to is that introduced in the Hilbert–(Karl) Menger–von Neumann–Debreu (and Arrow) tradition (see Marchionatti and Mornati, Citation2016). Its peculiarity is to consider rigor entirely divorced from empirical corroboration. In this construction, Karl Menger can be considered the point of departure in economics. His Citation1936 paper was the first instance in economics of a clear separation between the question of logical interrelations among various propositions and the question of empirical validity. According to Menger, this was the key step needed to transform economics into a science. This “clear separation” between the question of logic and the question of empirical validity was described by Schumpeter (Citation1954, p. 1037) as “a shining example of the general tendency towards increased rigor that is an important characteristic of the economics of our own period.” Menger’s paper was at the basis of Wald’s work on the existence and uniqueness of solutions to the general economic equilibrium (GEE) equations, and of the program for the new mathematization of Walrasian general economic equilibrium theory. The axiomatic approach in economics was applied in von Neumann’s Citation1937 paper in a “totally coherent way,” in the sense that the concern for the economic interpretation of the model (still present in Wald, Citation1936) disappears. This theoretical attitude derived, first, from the fact that von Neumann dealt with the economic question as a mathematician: in this way he obtained a mathematical solution of (to quote von Neumann himself) a “highly generalised problem in theoretical economics” characterized by “the elegance of its solution, logical completeness, concision and rigor,” but he had to adopt “extremely artificial assumptions,” or “idealisations” as von Neumann termed them. Second, von Neumann’s attitude derived from the fact that he dealt with theoretical economic problems like a formalist mathematician: that is, he conceived the model as a formal structure whose legitimacy and cogency depend on its internal consistency. The radical extension of formalism in economics was definitively affirmed with Debreu (Citation1959) and the Arrow–Debreu model of the GEE. As Weintraub (Citation2002) wrote: “From Hilbert to von Neumann, to the Mengerkries and Wald, to Bourbaki and thence to Debreu runs the chain of causality, the development of modern economic theory in its unconcern to study real economies.” Debreu effected the complete shift to Hilbert’s definition of mathematical rigor: “demand,” “supply,” and “equilibrium” became nothing more referentially real than letter sequences. As a consequence, a trade-off between rigor and relevance emerged: “Allegiance to rigor dictates the axiomatic form of the analysis where the theory, in the strict sense, is logically entirely disconnected from its interpretations,” states Debreu, adding that such a dichotomy between the theory in the strict sense and its interpretation “reveals all the assumptions and the logical structure of the analysis” and “makes possible immediate extensions of that analysis without modification of the theory by simple reinterpretations of the concepts” (Debreu, Citation1959, p. x). This implies that the actual aim of the theory is not realism but an understanding of the implications of its axioms and assumptions for its results. This is Lucas’s theoretical tradition, and the concept of rigor he used (and implicitly assumed by neo-Walrasian macroeconomists) has to be understood within it.

18The attempt to incorporate ideas developed in other contexts (not mainstream) into mainstream economics, ignoring those contexts and thus leaving mainstream analysis essentially unchanged, is described by Palley (Citation2013) as “Gattopardo economics.”

Additional information

Notes on contributors

Roberto Marchionatti

Roberto Marchionatti is a professor of economics, Department of Economics and Statistics, University of Torino.

Lisa Sella

Lisa Sella is a researcher, CNR–IRCrES, and Department of Economics and Statistics, University of Torino.
