Original Articles

ARCH tests and quantile regressions

Pages 277-292 | Accepted 22 May 2003, Published online: 13 May 2010
 

Abstract

We consider a test based on quantile regressions to verify the presence of conditional heteroskedasticity. The test does not rely on distributional assumptions of the errors, nor on a function describing the pattern of heteroskedasticity. It compares the slope coefficients of the regressions computed at different quantiles. Under homoskedasticity, different regression quantiles yield parallel hyperplanes, and the slope coefficients are not significantly different from one quantile to the other. This is not the case when heteroskedasticity occurs. A Monte Carlo study is implemented in order to verify the behavior of this class of tests for conditional heteroskedasticity based on quantile regressions.

Notes

1. However, this should not be a problem, since quantile regression procedures are included in standard statistical packages.

2. The ARCH process in Equation (1) is defined in its simplest form, as in Engle (1982). Many generalizations have been proposed in the literature, like GARCH (Bollerslev, 1986), GARCH-M (Engle et al., 1987), and so forth. Although we do not take them into account, the approach we analyze can cover the different specifications, since we avoid any assumption about the functional form of the ARCH.
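For concreteness, the ARCH(1) process in this simplest form can be simulated as follows (an illustrative sketch; the parameter values are arbitrary choices):

```python
import numpy as np

def simulate_arch1(n, alpha0=0.2, alpha1=0.5, seed=0):
    """Simulate e_t = sigma_t * eps_t with sigma_t^2 = alpha0 + alpha1 * e_{t-1}^2.

    alpha0 > 0 and 0 <= alpha1 < 1 give a stationary process with
    unconditional variance alpha0 / (1 - alpha1)."""
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n)
    e = np.zeros(n)
    for t in range(1, n):
        sigma2 = alpha0 + alpha1 * e[t - 1] ** 2
        e[t] = np.sqrt(sigma2) * eps[t]
    return e

e = simulate_arch1(50_000)
print(e.var())  # close to the unconditional variance 0.2 / (1 - 0.5) = 0.4
```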

3. LAD coincides with maximum likelihood if the error density is a double exponential. The objective function is $\sum_{t=1}^{n} |e_t|$, where the $\rho(e_t)$ function coincides with the absolute value. LAD is more efficient than OLS when $(2f(0.5))^{-2} < \sigma^2$, where $f(0.5)$ is the height of the error density at the median and $\sigma^2$ is the variance of the errors.

4. This is the major difference between our approach and the Koenker and Zhao (1996) proposal. In the AR(p) model $y_t = \beta_0 + \sum_{i=1}^{p} \beta_i y_{t-i} + e_t$, Koenker and Zhao present the quantile regression estimator of the auxiliary equation defining the ARCH process, $e_t = \sigma_t \epsilon_t = (\alpha_0 + \alpha_1 |e_{t-1}| + \cdots + \alpha_q |e_{t-q}|)\epsilon_t$. Assuming sufficient conditions for the stationarity and ergodicity of $y_t$, and provided a consistent estimate of the coefficients of the main equation, they prove the asymptotic normality of the estimated coefficients. The vector $n^{1/2}(a(\theta) - \alpha(\theta))$, with $a(\theta)$ being the quantile estimator of $\alpha(\theta)$, is asymptotically normal with zero mean and asymptotic covariance $[\theta(1-\theta)/f^2(F^{-1}(\theta))]\, D_1^{-1} D_0 D_1^{-1}$, where $D_0 = \lim (1/n) \sum_{t=1}^{n} Z_t Z_t'$, $D_1 = \lim (1/n) \sum_{t=1}^{n} Z_t Z_t'/\sigma_t$, and $Z_t = [1, |e_{t-1}|, \ldots, |e_{t-q}|]'$. The model is extended to yield the contemporaneous estimate of the main equation and the auxiliary regression for an AR(1) process with errors following an ARCH(1). The objective function in this case is $\sum_{t=1}^{n} \rho_\theta(y_t - \gamma_0 - \gamma_1 y_{t-1} - \gamma_3 |y_{t-1} - \gamma_2 - \gamma_1 y_{t-2}|)$, and the asymptotic behavior can be found in Jurechkova and Prochazka (1994).

5. Alternatively, the auxiliary regression could be defined as $\sigma_t = \alpha_0 + \alpha_1 |e_{t-1}|$, estimated by $|e_t| = \alpha_0 + \alpha_1 |e_{t-1}|$, with objective function $\sum_{t=1}^{n} \rho_\theta(y_t - x_t'\beta)$ for the main equation and $\sum_{t=1}^{n} \rho_\theta(|e_t| - \alpha_1 |e_{t-1}|)$ for the auxiliary regression, with normal equations

For this model the stationarity conditions can be found in Koenker and Zhao (1996).

6. For this model the stationarity conditions can be found in Engle (1982). Here and in the other first order conditions of this section, we skip the RQ estimator for $\alpha_0$. This is done to simplify notation, since we do not really intend to estimate the auxiliary regression, nor its intercept. However, it would suffice to replace the scalar $e_{t-1}^2$ with the vector $[1\;\, e_{t-1}^2]$. The same considerations apply to the RC and RCA models in the next paragraph.

7. Lemma A2 is an adaptation of Lemma A3 of Ruppert and Carroll (1980) to the case of stationary ARCH processes.

8. By estimating the auxiliary regression which defines the conditional heteroskedasticity, Koenker and Zhao look at the significance of the parameters of the auxiliary regression. If all but the intercept are not significantly different from zero, the model is conditionally homoskedastic. The intercept differs from zero since it represents the constant part of the conditional variance.

9. For further reference on RQ with non-identically distributed errors, see Hendricks and Koenker (1992).

10. Here we focus on the main equation alone, and we do not estimate the coefficients of the different ARCH models, in order to avoid misspecification. This implies that the matrix $Q$ coincides with the upper left block of the matrix $M$ of Section 3, while $D$ coincides with the upper left corner of the block matrix $A$ of Section 3.

11. For further reference on heteroskedasticity and autocorrelation consistent covariance matrices, see Keener et al. (1991), who define it in the least squares framework, $\Sigma = X'ee'X$, or Weiss (1990) for the median regression with autocorrelated errors. Fitzenberger (1998) presents a heteroskedasticity and autocorrelation consistent covariance matrix computed using a moving block bootstrap procedure, which is however computationally intensive.

12. The $G$ matrix involves two kinds of approximations. The first is due to the substitution of the heteroskedasticity $\sigma_t = \sigma(x_t)$, or $\sigma_t = \sigma(e_{t-1})$, or $\sigma_t = \sigma(v_{t-1})$, with the chosen function of the errors $g(e_t(\theta)) = \delta_t = e_t(0.5) - [(2f(0.5))^{-2}]^{1/2}$. The second approximation replaces the errors $e_t(\theta)$ with the residuals from the quantile regressions, $u_t(\theta)$.

13. Portnoy (1992) uses this measure to test for independence: if the difference is not significantly different from zero, then the data are independent. The test relies on the idea of Koenker and Bassett that i.i.d. errors generate parallel quantile regressions.

14. This exclusion in the componentwise average is needed to avoid unwanted simplification in $H\zeta^* = H\zeta - H_d(\theta)$.

15. The simulations have been implemented using Stata, version 6.

16. A closely related model is discussed in Chow (1984) and in Judge et al. (1985). We need to introduce an additional assumption on the behavior of the random coefficients. If we assume that the random regression coefficients follow an AR(p) process (or an AR(1), to simplify notation), the model considered here coincides with the one discussed in Chow (1984, p. 1234), and the same stability conditions apply. If we assume instead that the random component is i.i.d., independent of $\epsilon_t$, with zero mean and a given constant variance, we need to assume that the coefficient matrix $\alpha$ is nonnegative definite, as in Judge et al. (1985, p. 808).

17. With respect to the usual stationarity conditions of an ARCH model, namely that the roots of the characteristic equation lie inside the unit circle, we need to add a constraint on the variance of $r_t$. The term $r_t$ is usually assumed to be i.i.d. with zero mean, constant variance, and independent of $\epsilon_t$. By setting this variance smaller than unity we prevent instability, since the equation for the autoregressive conditional variance states that $\mathrm{var}(e_t | I_{t-1}) = \sigma_\epsilon^2 + \sigma_r^2 v_{t-1}^2 = \alpha_0 + \alpha_1 v_{t-1}^2$. For an ARCH(q) process the same condition has to hold for the sum of the coefficients of the auxiliary regression, that is $\sum_{i=1}^{q} \alpha_i < 1$, excluding the intercept, which refers to $\sigma_\epsilon^2$.

18. Alternatively, the auxiliary regression could be defined as $|e_t| = \alpha_0 + \alpha_1 |v_{t-1}|$, with objective functions $\sum_{t=1}^{n} \rho_\theta(y_t - \kappa y_{t-1} - (x_t - \kappa x_{t-1})'\beta)$ and $\sum_{t=1}^{n} \rho_\theta(|e_t| - \alpha_1 |v_{t-1}|)$, and gradient
