
A new t-test for the R/S analysis and long memory in agricultural commodity prices

Pages 661-667 | Published online: 21 Aug 2006

Abstract

This article tests for long memory in daily and weekly agricultural cash price returns using the modified rescaled range (R/S) test. A new corrected t-test is constructed for the R/S test so that statistical significance is measured properly. Empirical results provide evidence of long memory in more than half of the agricultural commodities analysed. However, the estimated H statistics are all below 0.6, indicating relatively weak long memory. The corrected t-test reduces Type I error for H statistics on the persistent long-memory side and increases the power of the test for H statistics on the anti-persistent side.
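
As a concrete illustration of the statistic being tested, the sketch below computes a Lo-type modified R/S statistic in Python. The article does not publish its code, so the Bartlett-weighted long-run variance, the default truncation lag q, and the function name modified_rs are assumptions, not the authors' implementation.

```python
import numpy as np

def modified_rs(x, q=5):
    """Lo-type modified rescaled-range statistic (a sketch; the
    truncation lag q and the Bartlett weights are assumptions)."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    d = x - x.mean()
    s = np.cumsum(d)
    R = s.max() - s.min()                  # range of the partial sums
    var = np.dot(d, d) / T                 # sample variance (1/T form)
    for j in range(1, q + 1):
        w = 1.0 - j / (q + 1.0)            # Bartlett weight
        gamma = np.dot(d[j:], d[:-j]) / T  # lag-j autocovariance
        var += 2.0 * w * gamma
    return R / np.sqrt(var * T)            # scaled statistic V_T
```

Under the short-memory null, the scaled statistic converges to the range of a Brownian bridge; values far outside Lo's tabulated confidence band point toward long memory.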

Notes

Brockwell and Davis (19xx) showed that when a weakly stationary process has short-term memory, its autocorrelation function is geometrically bounded; when the process has long-term memory, however, its autocorrelation function exhibits hyperbolic decay. Hyperbolic decay means that if x_t = Q(L)w_t, then for any geometric rate r with 0 < r < 1, the coefficient on L^s in Q(L) is larger than r^s for all s greater than some sufficiently large S.
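
The distinction can be restated formally. In the display below the moving-average coefficients ψ_s and the constants C and r are notation introduced for this note, not symbols taken from the article.

```latex
% Short memory: geometrically bounded autocorrelations.
% Long memory: hyperbolic decay, slower than every geometric rate.
\begin{align*}
\text{short memory:}\quad & |\rho(s)| \le C\,r^{s}
  \quad\text{for some } C > 0,\ 0 < r < 1;\\
\text{long memory:}\quad & \rho(s) \sim C\,s^{2H-2}
  \quad (s \to \infty),\ \tfrac{1}{2} < H < 1.
\end{align*}
% Hence if $x_t = Q(L)w_t$ with $Q(L) = \sum_{s \ge 0} \psi_s L^s$,
% hyperbolic decay implies that for every $r \in (0,1)$ there is an
% $S(r)$ with $\psi_s > r^s$ for all $s > S(r)$.
```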

The choice of the sub-time series X_{i,T} is explained by Corazza et al. (1997) and Greene and Fielitz (1977).
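
As an illustration of how sub-series enter the analysis, the sketch below implements the classical block-averaged R/S estimate of H in Python. The sub-series lengths are left to the caller; the sketch does not reproduce the selection rules of Corazza et al. (1997) or Greene and Fielitz (1977), and the function name hurst_rs is hypothetical.

```python
import numpy as np

def hurst_rs(x, lengths):
    """Classical R/S estimate of the Hurst exponent H: split x into
    contiguous sub-series X_{i,T} of length T, average R/S across the
    sub-series, then regress log(R/S) on log(T)."""
    x = np.asarray(x, dtype=float)
    log_t, log_rs = [], []
    for T in lengths:
        rs_vals = []
        for i in range(len(x) // T):       # number of full sub-series
            block = x[i * T:(i + 1) * T]
            d = block - block.mean()
            c = np.cumsum(d)
            R = c.max() - c.min()          # range of cumulative deviations
            S = block.std()                # sub-series standard deviation
            if S > 0:
                rs_vals.append(R / S)
        log_t.append(np.log(T))
        log_rs.append(np.log(np.mean(rs_vals)))
    # The slope of the log-log regression estimates H
    return np.polyfit(log_t, log_rs, 1)[0]
```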

If a time series has H > 0.5, the series is less jagged, and hence less turbulent, than a random walk would be, and it is therefore called persistent. The opposite is true of a time series with H < 0.5. By considering the jaggedness of a series together with its fractional dimension, the Hurst exponent can be said to measure long-term dependence. As H approaches 1.0, the length and strength of the persistent behaviour increase and the series plot becomes less jagged than a random walk; the opposite holds for anti-persistence as H approaches zero.
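
The link between jaggedness and H can be made concrete through the fractal dimension of the series' graph; the relation below is the standard one for self-affine traces and is not stated in the article itself.

```latex
% Fractal dimension D of a self-affine series versus its Hurst exponent:
\[
D = 2 - H, \qquad 0 < H < 1,
\]
% so H -> 1 gives D -> 1 (a smooth, persistent trace), while
% H -> 0 gives D -> 2 (a jagged, anti-persistent trace).
```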

Monte Carlo experiments were performed to test for long memory in random-walk series generated by an algorithm. Random-walk series with N = 1000, 2000, …, 10 000 were generated using a random number generator in GAUSS and then randomly scrambled several times. For each N, the long-memory test was performed using the modified R/S analysis. This procedure was repeated several hundred times and the average value of the estimated Hurst exponent, E[Ĥ], was computed. E[Ĥ] was larger than 0.5 for all values of N; specifically, most of the cases fell around 0.53.
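
A Python analogue of this experiment is sketched below; the original was written in GAUSS, so the replication count, the sub-series lengths, and the reuse of the hypothetical hurst_rs helper from the previous note's sketch are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

for n in range(1000, 11000, 1000):
    estimates = []
    for _ in range(200):                 # "several hundred" repetitions
        steps = rng.standard_normal(n)   # i.i.d. random-walk increments
        rng.shuffle(steps)               # scramble away any ordering
        estimates.append(hurst_rs(steps, lengths=(50, 100, 250, 500)))
    # E[H-hat] should sit slightly above 0.5 (around 0.53) because the
    # classical R/S estimate is biased upward in finite samples
    print(n, round(float(np.mean(estimates)), 3))
```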

The standard normal distribution can be adopted for the null hypothesis because it is mathematically easy to simulate and its properties are well known.
