
An Econometric Analysis of Volatility Discovery


Abstract

We investigate information processing in the stochastic process driving a stock’s volatility (volatility discovery). We apply fractional cointegration techniques to decompose the estimates of the market-specific integrated variances into an estimate of the common integrated variance of the efficient price and a transitory component. The market weights on the common integrated variance of the efficient price are the volatility discovery measures. We relate the volatility discovery measures to the price discovery framework and formally show their roles in the identification of the integrated variance of the efficient price. We establish the limiting distribution of the volatility discovery measures by resorting to both long-span and in-fill asymptotics. The empirical application is in line with our theoretical results: it reveals that trading venues incorporate new information into the stochastic volatility process in an individual manner and that the volatility discovery analysis identifies an information process distinct from that identified by the price discovery analysis.
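
To fix ideas, the sketch below illustrates the bookkeeping described above: a hypothetical α⊥ vector (standing in for an FCVAR estimate) is normalized into volatility discovery weights, and the weighted combination of market-specific realized variances serves as a proxy for the common integrated variance. This is a toy illustration under those assumptions, not the paper’s estimation procedure.

```python
import numpy as np

# Minimal illustrative sketch (not the paper's estimation procedure): given a
# T x S panel of market-specific realized variances `rv` and a hypothetical
# alpha_perp vector standing in for the FCVAR estimate, form the volatility
# discovery weights and the implied common and transitory components.
rng = np.random.default_rng(0)
T, S = 1000, 2
persistent = np.cumsum(rng.normal(scale=0.01, size=T))          # toy persistent factor
rv = np.exp(persistent[:, None] + rng.normal(scale=0.1, size=(T, S)))

alpha_perp = np.array([0.7, 0.4])                               # hypothetical estimate
weights = alpha_perp / alpha_perp.sum()                         # normalize: weights sum to one
common_iv = rv @ weights                                        # common integrated variance proxy
transitory = rv - common_iv[:, None]                            # market-specific transitory parts

print("volatility discovery weights:", weights)
```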

Supplementary Materials

The supplementary material (SuppVD.pdf) contains the proofs of Proposition 1 and Theorems 1 and 2. It also presents a review of the price discovery measures, a simulation exercise, data details, and additional empirical results.

Acknowledgments

The authors are grateful to the Editor Prof. Atsushi Inoue, an anonymous associate editor, two anonymous referees, Marcelo Fernandes, Asger Lunde, Carsten Tanggaard, Pedro Valls, and seminar participants at several workshops and conferences. We are grateful for the opportunity to run the computational analysis on the High Performance Computing Cluster supported by the Research and Specialist Computing Support service at the University of East Anglia (UEA).

Disclosure Statement

The authors report there are no competing interests to declare.

Notes

2 The finance and econometrics literature has long viewed volatility as a separate stochastic process (see the excellent survey on stochastic volatility models in Shephard and Andersen (Citation2009)).

3 There is well-documented evidence in the literature that estimates of integrated variance (e.g., realized variance) display long memory features, that is, these estimates are highly persistent and exhibit an autocorrelation function that decays at a hyperbolic rate (Andersen and Bollerslev, Citation1997; Andersen et al., Citation2003; Corsi, Citation2009, among others).
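
As a rough illustration of this point, the toy simulation below computes daily realized variance from simulated 5-minute returns under a highly persistent (though, for simplicity, not genuinely long-memory) volatility process and reports its slowly decaying autocorrelations; all quantities are synthetic.

```python
import numpy as np

# Toy illustration: realized variance computed from simulated 5-minute returns
# inherits the persistence of the underlying volatility process, producing a
# slowly decaying autocorrelation function.
rng = np.random.default_rng(1)
days, m = 2000, 78                                   # 78 five-minute intervals per day
log_sigma = np.zeros(days)
for t in range(1, days):                             # persistent (AR(1)) toy log-volatility
    log_sigma[t] = 0.98 * log_sigma[t - 1] + rng.normal(scale=0.1)
sigma = 0.01 * np.exp(log_sigma)

intraday = rng.normal(size=(days, m)) * sigma[:, None] / np.sqrt(m)
rv = (intraday ** 2).sum(axis=1)                     # one realized variance per day

def acf(x, lag):
    x = x - x.mean()
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

print({k: round(acf(rv, k), 3) for k in (1, 5, 20, 50)})
```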

4 The exact parametric functional form of the stochastic volatility process σm is not relevant to our analysis; we only stress that σm is driven by a Brownian motion that is different from, though possibly correlated with, W.

5 See supplementary material for an overview on the price discovery framework.

6 This is consistent with previous empirical findings, such as those of Rossi and de Magistris (Citation2014), among others, who find estimates of d greater than 0.5.

7 See supplementary material for the details.

8 As with the orthogonal complements of the parameters in the VEC model, α⊥ and β⊥ are not unique and hence are not fully identified. Without loss of generality, we impose the normalizations β⊥ = ι_S and α⊥′ι_S = 1.
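
Stated as a short calculation (a standard restatement of the normalization, not quoted from the paper):

```latex
% For any nonzero scalar c, c\,\alpha_\perp spans the same space as \alpha_\perp,
% so the weights are identified only up to scale; the normalization fixes the scale:
\tilde{\alpha}_\perp \;=\; \frac{\hat{\alpha}_\perp}{\hat{\alpha}_\perp' \iota_S}
\qquad\Longrightarrow\qquad
\tilde{\alpha}_\perp' \iota_S \;=\; 1 .
```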

9 The drift component is slower moving than the diffusion component and has zero quadratic variation. As a result, the parameterization of the drift in Assumption 1 does not affect the quadratic variation of observed prices, which remains the same as in (3).
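
The underlying step, in generic notation p_t = ∫_0^t μ_s ds + ∫_0^t σ_s dW_s (which may differ in details from Assumption 1), is:

```latex
% A continuous finite-variation drift has zero quadratic variation, so the
% quadratic variation of the observed (log-)price is driven entirely by the
% diffusion part:
\Big[\textstyle\int_0^{\cdot}\mu_s\,ds\Big]_t = 0
\quad\Longrightarrow\quad
[p]_t = \Big[\textstyle\int_0^{\cdot}\sigma_s\,dW_s\Big]_t = \int_0^t \sigma_s^2\,ds .
```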

10 In line with the empirical evidence in Section 5, we simplify the exposition and set κ = 0 and d = b.

11 The fBm is defined as ∫_{−∞}^{0} [(t − u)^d − (−u)^d]/Γ(1 + d) dW_u + ∫_{0}^{t} (t − u)^d/Γ(1 + d) dW_u, where Γ is the Gamma function. The fBm process can also be written in terms of the Hurst index H rather than d, with d = H − 1/2. Comte, Coutin, and Renault (Citation2012) propose an affine class of long memory volatility processes that involves fractional integration of the square root of the spot volatility process, rather than of its logarithm as in (12). Additionally, the fractional OU in (12) can be written in terms of log(σ_{m,t}^2) (Rossi and de Magistris, Citation2014).
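
For intuition, the sketch below discretizes the second (type II) integral above and uses its increments to drive a toy fractional OU log-variance; it assumes a simple Euler/Riemann scheme and hypothetical parameter values, and is not the paper’s simulation design.

```python
import numpy as np
from scipy.special import gammaln

# Illustrative only: discretize the type II part of the fBm,
#   B_t ≈ sum_j (t - u_{j-1})^d / Gamma(1 + d) * ΔW_j,
# and use its increments to drive a toy fractional OU log-variance.
rng = np.random.default_rng(2)
d, n = 0.45, 2000
dt = 1.0 / n
dW = rng.normal(scale=np.sqrt(dt), size=n)
t_grid = np.arange(1, n + 1) * dt

fbm = np.empty(n)
for i in range(n):
    lags = t_grid[i] - t_grid[:i + 1] + dt               # (t - u) at left endpoints
    fbm[i] = (lags ** d / np.exp(gammaln(1 + d))) @ dW[:i + 1]

lam, theta = 2.0, -8.0                                    # hypothetical OU parameters
log_var = np.full(n, theta)
dB = np.diff(fbm, prepend=0.0)
for i in range(1, n):                                     # Euler scheme for the fractional OU
    log_var[i] = log_var[i - 1] + lam * (theta - log_var[i - 1]) * dt + 0.5 * dB[i]

print(log_var[-5:])
```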

12 Assumptions 2 and 3 are equivalent to Assumptions 1–4 in Johansen and Nielsen (Citation2012).

13 Theorem 4 in Johansen and Nielsen (Citation2012) establishes that the likelihood function has a strict minimum at the true parameter vector and converges uniformly to its deterministic limit.

14 Regulation NMS (Reg NMS), implemented in 2007, allows the entry of new trading venues that are linked together and compete for order flow, liquidity, and trades.

15 Estimation results for the FCVAR models are obtained using the computer program by Nielsen and Popiel (Citation2014).

16 In addition to requiring that the residuals be a white noise process, we choose κ so that the roots of the characteristic polynomials lie outside the transformed unit circle (see Johansen, Citation2008, for a theoretical discussion of identification).
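
As an illustration of the residual-whiteness part of this choice, the sketch below applies a Ljung–Box test to placeholder residual matrices standing in for the residuals of candidate FCVAR fits; in practice the residuals would come from the estimation software, and the lag order and significance level here are arbitrary.

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

# Placeholder residuals standing in for FCVAR residuals at candidate lag orders;
# in practice these would be taken from the fitted models.
rng = np.random.default_rng(3)
residuals_by_kappa = {k: rng.normal(size=(1000, 2)) for k in (0, 1, 2)}

for k, resid in residuals_by_kappa.items():
    pvals = [
        acorr_ljungbox(resid[:, j], lags=[10], return_df=True)["lb_pvalue"].iloc[0]
        for j in range(resid.shape[1])
    ]
    print(f"kappa = {k}: Ljung-Box p-values at lag 10 = {np.round(pvals, 3)}")
```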

17 We do not report the p-values when the null hypothesis is rk(αβ′) = 0 because we strongly reject the null for the 30 assets in all market combinations.

18 Tables S.5 and S.6 in the supplementary material report the estimates of α and the LM-test for serial correlation in the residuals, respectively.

19 Because we find κ = 0 for nearly all stocks (see Table S.4 in the supplementary material), the hypothesis tests below can, in most cases, be interpreted as Granger causality tests.

20 See Table S.7 in the supplementary material for the p-values of the volatility discovery measures.

21 See supplementary material for an overview of the price discovery analysis and the estimation details.