
The unit root problem: Affinities between ergodicity and stationarity, its practical contradictions for central bank policy, and some consideration of alternatives

Pages 339-363 | Published online: 10 Apr 2018
 

ABSTRACT

The initial focus of this article is the mismatch between policy goals and statistical analysis that arises from how data are transformed and processed. This intrinsically raises ontological issues regarding the nature of the economy within which policy is made and to which statistical analysis is applied. These issues are of general significance to post Keynesians irrespective of the position they take on the specifics of the ergodicity debate, though they overlap with some aspects of that debate. The problem as posed in this article is specific: a practical contradiction between central bank policy and the problem of unit roots. The authors then consider some additional ways in which one can go beyond common practice, based on the example of Forward Guidance in the United Kingdom and a more institutional approach to post Keynesian analysis.

JEL CLASSIFICATIONS:

Notes

1To be clear, it is not our intention to engage extensively with, or take a definite position on, the ergodicity debate, its centrality to post Keynesian thought, and its roots in Keynes’ work. For this, see the O’Donnell and Davidson exchanges. One might also gain context from King’s history of post Keynesian thought (King Citation2002). Davidson (Citation2005) also has a position on this and a set of reservations regarding Babylonian method (Dow Citation2005) and, inter alia, critical realism’s critique of axioms via formalism. Davidson argued for a small-tent post Keynesianism initially focused on: an economic system as a process moving irreversibly through calendar time; a key role for expectations under ontological or real-world uncertainty; and core theoretical commitments to an essential difference between financial and real capital and to income effects dominating substitution effects. It is with reference to irreversibility and uncertainty that he rejects ergodicity.

2Álvarez and Ehnts (Citation2016) provided a more substantive argument for the meaning of ergodicity in mathematics: it is a concept in a mathematical system and is, therefore, used in a different sense in economics regarding real systems, and this can be confusing. Regarding Samuelson, as many have noted, he was usually careful to acknowledge that models are limited, assumptions may be unrealistic, and one should establish, rather than assume, that findings and claims are transferable to real-world conditions. As a polymath he was interested in all aspects of economics, including the history of economic thought. Still, it remains the case that he created the template for less careful and able economists to absorb a mindset. His more circumscribed work is translated via textbooks (based on a template he himself created; see Zuidhof Citation2014) and later literature and imitators. For example, his comments on ergodicity in the often-cited paper on classical and new classical monetary theory note that there are problems with the assumptions applied, but he still orients on ergodicity to avoid “hystericity” and to enable the possibility of generalizable claims. As such, the probabilistic underpinnings become the defining feature of law-like states expressed through ergodicity, and this in turn becomes a key constituent of what makes economics a science: its formal expression and stochastic range (see Samuelson, Citation1968, pp. 12–13).

3Two issues after the special section in the Journal of Economic Perspectives focused on Angrist and Pischke’s claim of a credibility revolution (via randomized control trials, etc.), Caballero (Citation2010) notes that the dominant attempts to respond to problems in modelling and statistical technique continue to marginalize concern with the real world; this remains a peripheral matter. This is also implicit in Leamer’s response to Angrist and Pischke, updating his con argument from the 1980s. Much of the change is technical sophistication as camouflage rather than a genuine attempt to master the problem of context for the dataset. It is notable that American Economic Association journals provide some recognition of core problems. Mainstream economics is not without change. However, critique continues to be partial and limited in effectiveness and scope. For example, there is much to admire in Paul Romer’s recent critique of macroeconomic theory, but if one also notes the previous paper on “mathiness” (Romer Citation2015), it is clear there is something inconsistent and partial in his position: mathiness is the ideological use of mathematics. He ascribes this to Robinson’s critique in the Cambridge Capital Controversy. This is an odd comment, given that the argument was about the consistency between formal constructability and logical entailment, and Cambridge, Massachusetts, conceded the actual argument.

4Note, the point of this article is to expose a methodological problem rather than to provide a detailed and sophisticated analysis of the technical aspects of statistical procedures. The focus is on unit roots. If space allowed, one might also distinguish issues regarding other sources of nonstationarity, such as deterministic trends, which produce a more defined period of non-mean reversion. Clearly, the problem also involves distinctions (by degree and source) between stochastic processes, where a random variable affects evolution and there is some degree of indeterminacy of outcome, and more deterministic forms, where evolution can only follow one pathway. The systemic implications are basic to the Davidson-O’Donnell debate, but again we are concerned with affinities, whilst taking a slightly different perspective.

5The problem is broader in terms of the way statistical approaches have responded to internal critique. For example, in his well-known critique of Leamer’s extreme bounds analysis solution to the con in econometrics, Sala-i-Martin (Citation1997) noted that the initial model specifications may involve spurious fits due to endogenous variable effects. Endogeneity is, of course, a core issue for post Keynesians.

6Similarly, as any general econometrics text will note, a series is integrated of order d, denoted I(d), if it is non-stationary in levels but becomes stationary after being differenced d times (e.g., Asteriou and Hall, Citation2016).
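To illustrate this terminology for readers less familiar with it, the following is a minimal Python sketch (our illustration, not drawn from the article or the cited text), using simulated data and the augmented Dickey–Fuller test from statsmodels: a random walk is I(1), so it fails a stationarity check in levels but passes once first-differenced.

import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=500))   # simulated random walk: contains a unit root, so I(1)
dy = np.diff(y)                       # first difference: should be stationary, i.e. I(0)

for label, series in [("levels", y), ("first difference", dy)]:
    stat, pvalue, *_ = adfuller(series)
    print(f"{label}: ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
# Expected pattern: a large p-value in levels (unit root not rejected) and a
# small p-value for the differenced series, consistent with integration of order 1.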

7Clearly, whatever one eventually accepts in terms of ergodicity, one cannot evade Keynes’ initial concerns here. Keynes’ work may or may not be explicable or translatable in ergodic terms, subject to definition, but beneath this is the problem of deriving and justifying meaningful constant coefficients and related issues of model specification. The ongoing issue has become how problems are managed and accommodated within economics as statistical practice, and what is achieved by doing so. This was basic to the Cowles Commission project (the antecedents of which initiated Keynes’ concerns) and also to the more interesting attempts to reconcile issues. In addition to the Nell and Errouaki example referred to later, one might also note Aris Spanos’ various papers (e.g., Spanos Citation2015) setting out the development of the LSE tradition building from Haavelmo (and sharing some commonality with Hendry’s work; see more generally Keane Citation2010).

8One should also note that differencing can lead to further issues, including loss of information central to the theoretical model (see Freedman, Citation2009). The error process can also be differenced, which then produces a non-invertible moving average error process, leading to issues with estimation (Plosser and Schwert, Citation1977). Furthermore, if we difference the variables, the model can no longer give a unique long-run solution (see Asteriou and Hall, Citation2016).
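The non-invertibility point can be illustrated with a minimal Python sketch (ours, indicative of the mechanism only, not a reproduction of Plosser and Schwert’s analysis): differencing a trend-stationary series with white-noise errors leaves an over-differenced error term of the form e_t − e_{t−1}, a moving average with a coefficient at (or estimated near) −1, which is non-invertible and complicates estimation.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
t = np.arange(600)
e = rng.normal(size=600)
y = 0.5 + 0.02 * t + e          # deterministic trend plus white-noise error
dy = np.diff(y)                 # differencing gives dy_t = 0.02 + e_t - e_{t-1}

fit = ARIMA(dy, order=(0, 0, 1)).fit()   # fit an MA(1) with constant to the differenced series
print(fit.params)                # parameters: constant, MA(1) coefficient, error variance;
                                 # the MA(1) coefficient should sit close to -1, i.e. at
                                 # the non-invertibility boundary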

9Ultimately, this must extend to ontological issues, though one may not like the term and one need not be an adherent of any current specific form of ontological position in economics. It is worth noting that the most self-aware attempts to justify the adequacy of statistical analysis first ask: what is the nature of the economy that we then seek to analyze? One need not agree with the mainly negative position on mathematics often attributed to Lawson to recognize the primary nature of this question. Arguably one must have a response to Lawson’s critique to have a basis to proceed. The range here has been wide in economics (see later examples, but also Downward Citation2003).

10At root, the dataset that is supposed to be representative of a data generating process we assume we can know has manifested as nonstationary, and thus as non-mean-reverting and nonhomogeneous. Intuitively, this leads to the inference that the process generating the data is irregular to a degree that results in this first-order non-homogeneity. For investigation to be meaningful there must be something that then arises within the data generating process to enable the application of statistical analysis (which can then be deemed appropriate). This is where a great deal of dispute has arisen. To what degree and in what sense is there an opaque order? It is a primary point, recognized at the beginning of introductory econometrics textbooks, that stationarity is a step in processing data to enable the disentangling of causal factors according to econometric techniques (econometricians typically speak of “well-behaved data”). Unfortunately, this usually and quickly gives way to a focus on various procedures to process data and test the processing: linearity, normality, etc. This becomes an introverted focus on what is appropriate to statistical analysis. This point of departure is subtly different from asking whether the statistical analysis is appropriate for its point of reference. This in turn has governed a great deal of the recognition of problems (alchemy and the con) and has, thereafter, structured proposed solutions (e.g., distribution-seeking iterative work along the lines of Hendry).

11To be clear, the only point being made here is that central banks do have an inflation targeting and price stability remit. This is not a comment on the adequacy of price stability measures or on the real role of central banking in contemporary financial architecture (are they, for example, forced to expand reserves to respond to private banks’ capacity to create money through credit creation from which deposits arise?). Davidson also has a position on this (Citation2006).

12Consider this in terms of forecasting: The Bank of England has been using ARIMA modelling. A primary condition of ARIMA modelling is stationarity. If inflation targeting necessarily leads to a non-stationary data series, but the model forecasting future inflation levels requires the inflation series to be stationary, a potential weakness is built into the modelling procedure.
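As a purely generic illustration of the tension described here (a Python sketch with simulated data; it is not the Bank of England’s model, data, or specification), ARIMA handles a unit root through its integration order d, so the modeller’s choice of d embodies a judgement about whether the inflation series is stationary:

import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(2)
infl = 2.0 + np.cumsum(rng.normal(scale=0.1, size=240))   # stand-in "inflation" series (simulated)

print("ADF p-value in levels:", round(adfuller(infl)[1], 3))  # informal check for a unit root

# If the series is judged non-stationary, an ARIMA(p, 1, q) differences it once
# so that the ARMA part is fitted to a (presumed) stationary series.
model = ARIMA(infl, order=(1, 1, 1)).fit()
print(model.forecast(steps=8))   # eight-period-ahead point forecasts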

13To be clear, Rosser’s assessment of Davidson’s argument is reasonable. However, because it is necessarily focused and attempts to draw together a great deal of material in a concise form based on a journal article word limit, it also has absences. Rosser puts aside critical inquiry into the standard practices and attitudes he refers to. He provides no comment on the actual prevalence of nonhomogeneity, nonstationarity, and nonergodicity. He recognizes that the reference point for these is also the conditions and possibilities of the world, over and above further considerations of subjective versus objective probability in terms of the nature of that world. One could make more of this. Rosser may be correct to note, as O’Donnell suggested, that there is “no guarantee at all of being able to really determine whether or not one is dealing with an ergodic system” (Citation2015, p. 339). However, this is not a warrant for methods that assume homogeneity, stationarity, and typically ergodicity. It generates a burden-of-proof issue for the adequacy of methods, rather than a reasonable presumption that one can adopt assumptions and so use standard practices, such as transformation via differencing. This is especially so where the methods impose more restrictive assumptions about reality that observation seems to render questionable. One might also note that Davidson seemed reasonably justified in claiming O’Donnell’s critique is tendentious. Although the technical issues O’Donnell raises are not wrong per se, he approached Davidson’s position rather unsympathetically, in search of contradiction rather than considering the general point at issue. O’Donnell dichotomized where polar positions do not necessarily apply. However, Davidson also contributed to the problem of meaningful communication by periodically shifting the terms of debate without recognizing that this is what he was doing.

14Such an identity would be difficult to interpret. It might imply that policy interventions today can only delay, but not change, the long-run optimal solution already predetermined by free markets (Davidson Citation2012), so the identity would be an epiphenomenon of a deterministic system. This would further indicate that no additional causal intervention has occurred or is relevant. This is also odd in terms of symbolic logic, since it seems both to entail and not to entail a unit root ordered system.

15These are conceptual, not only technical, and so different from issues addressed as essentially technical: for example, time-homogeneous properties in Markov chains (“memorylessness,” etc.). Markov processes introduce new ways in which stochastic problems are posed. In quantitative finance this has led to new models for asset prices. A lot of this is problematic insofar as it is based on a general equilibrium framework.
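For readers unfamiliar with the terminology, the following minimal Python sketch (our illustration) shows what time-homogeneity amounts to: the same transition matrix is applied at every step, so multi-step behaviour reduces to matrix powers, and the next state depends only on the current one (“memorylessness”).

import numpy as np

# A two-state chain with a fixed (time-homogeneous) transition matrix:
# row i gives the probabilities of moving from state i to each state.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

start = np.array([1.0, 0.0])                    # begin with certainty in state 0
print(start @ np.linalg.matrix_power(P, 10))    # distribution over states after 10 steps
# Relaxing time-homogeneity would mean allowing P itself to change with time,
# which is one way the conceptual issues noted above go beyond the technical setup.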

16How successful this can be, of course, remains a matter of dispute. Further innovative approaches to the problem are to be found in the philosophy of statistics. For example, Gillies (Citation2000) provided an excellent overview of the history and grounding assumptions of varieties of probability in order to advocate pluralism. One of the positions he sets out is propensity probability, which has been taken up by a variety of scientific realists working mainly in sociology and social policy, but also in economics. Case-based propensity seeks or constructs datasets for states in relations (such as medical records). Because the approach is based on actual cases that can be recorded through time in different categories, it differs from frequency or sample approaches, and so does not require a tacit representative subject or default to requirements of a qualitatively invariant aspect of the social world (at least in the sense implicit in samples based on frequency constructs). Techniques like cluster analysis can explore the dataset under different classifications, and so changes of state for relations can be expressed through time. Advocates in general tend to view social measurement as providing clues to mechanisms for abduction, etc. For innovative work and argument in the United Kingdom, see the four-volume edited collection by Olsen (Citation2010), and also Byrne and Ragin (Citation2009).
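As a minimal, hypothetical Python sketch of the kind of exploratory cluster analysis mentioned here (the data and the choice of k-means are ours, purely for illustration): cases recorded on a few attributes are grouped into clusters, and the classification can be rerun as cases change state through time.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# 60 hypothetical "cases", each recorded on two attributes
cases = np.vstack([rng.normal(0, 1, size=(30, 2)),
                   rng.normal(4, 1, size=(30, 2))])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(cases)
print(np.bincount(labels))   # number of cases assigned to each cluster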

17The Fed introduced something different in January 2012: a dot plot of each Federal Open Market Committee member’s projected fed funds rate for the next 3 years, deemed necessary to achieve stated economic goals. This graphs the Committee’s sentiment for analysts to then interpret.

18However, it was the U.K. Treasury that received most of the initial criticism. Its initial medium-term forecast was that Brexit would reduce GDP by 3.6% over the first 2 years, and its longer-term steady-state comparison claimed that the United Kingdom would be 3.4% to 9.5% worse off outside the EU in 2030. The former assumed an immediate triggering of Article 50 and a sharp reduction in consumption and business revenue.

19For example, the Bank of England commissioned Fed researcher David Stockton in 2012 to produce an analysis of the Bank’s forecasting performance. This was followed up by Bank-commissioned research by Andrew Wood at the University of Essex in 2013.

20As Haldane stated, “To pinpoint the specific contribution made by monetary policy, we need a model of the economy. By using the Bank of England’s suite of macro-economic models, we can quantify the specific role played by monetary policy in explaining movements in incomes and jobs, albeit rather imperfectly given that all models are imperfect” (Haldane, Citation2016, p. 9).

21An associated language of upward and downward revisions gives the impression that the problem is always data absence, or some statistical error that is later accommodated. This is quite different from accepting that the model cannot be what it is constructed to be.

Additional information

Notes on contributors

Muhammad Ali Nasir

Muhammad Ali Nasir is with the Economics, Analytics & International Business Group, Faculty of Business & Law, Leeds Beckett University.

Jamie Morgan

Jamie Morgan is with the Faculty of Business & Law, Leeds Beckett University.
