On the bi-dimensionality of liquidity

Pages 542-566 | Published online: 19 Aug 2006

Abstract

Variations in overall liquidity can be measured by simultaneous changes in both immediacy costs and depth. Liquidity changes, however, are ambiguous whenever both liquidity dimensions do not reinforce each other. In this paper, ambiguity is characterized using an instantaneous time-varying elasticity concept. Several bi-dimensional liquidity measures that cope with the ambiguity problem are constructed. First, it is shown that bi-dimensional measures are superior since commonalities in overall liquidity cannot be fully explained by the common factors in one-dimensional proxies of liquidity. Second, it is shown that an infinitesimal variation in either market volatility or trading activity augments the probability of observing an unambiguous liquidity adjustment. Ambiguity strongly depends on the expected (deterministic) component of volatility.

Acknowledgements

Pascual is grateful to the Fundación Caja Madrid for its financial support and Escribano to the Secretaría de Estado de Educación y Universidades PR2003-0305. The authors acknowledge financial support from Spanish DGICYT project #PB98-0030. We would like to thank participants at the European Financial Management 1999 Meetings in Paris (France), Foro de Finanzas 1999 in Segovia (Spain), VII Foro de Finanzas 1999 in Valencia (Spain) and the 10th (EC)2 Conference on Financial Econometrics 1999 in Madrid (Spain). The authors are especially grateful to Gonzalo Rubio and Daniel Cardona for their helpful suggestions. The contents of this paper are the sole responsibility of the authors.

Notes

The liquidity ratio (LR) is simply the ratio of accumulated trading volume to the accumulated change in prices over a given period. It ignores whether volume is buyer- or seller-initiated. The measure is extremely sensitive: small price changes drive the liquidity ratio to extremely high values. In our opinion, VNET better captures the sensitivity of prices to order flow and has better properties.
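As an illustration of the note above, the liquidity ratio and its sensitivity problem can be sketched as follows (this is an illustrative interpretation, not the authors' code; the accumulated price change is taken here as the sum of absolute price moves):

```python
def liquidity_ratio(volumes, prices):
    """Accumulated trading volume divided by accumulated absolute price change."""
    total_volume = sum(volumes)
    price_change = sum(abs(prices[i + 1] - prices[i]) for i in range(len(prices) - 1))
    if price_change == 0:
        # Degenerate case: no price movement at all sends LR to infinity.
        return float("inf")
    return total_volume / price_change

# Heavy volume with a tiny price move yields a very large LR,
# illustrating the sensitivity the note criticizes.
print(liquidity_ratio([1000, 2000, 1500], [100.00, 100.01, 100.00]))
```

Because the accumulated price change sits in the denominator, any quiet interval with non-trivial volume pushes LR toward extreme values regardless of how the order flow was absorbed.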

With systematic sampling (SS), every stock has the same probability of being selected, as with simple random sampling (SRS). However, the final SS-sample is more representative than the SRS-sample. SS consists of sorting the population by market capitalization, generating a random number k between 1 and m, the nearest integer to 2574/150, and then selecting the stocks in positions k + (r - 1)m, where r = {1, 2, …, 150} (see Som, Citation1996, pp. 81–90 for details).
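The sampling procedure in the note can be sketched as follows (a minimal sketch of textbook systematic sampling, not the authors' code; the stock list is a hypothetical stand-in for a capitalization-sorted population):

```python
import random

def systematic_sample(population, n):
    """Systematic sampling: draw a random start k in 1..m (m = sampling
    interval), then take every m-th element starting at position k.
    Assumes n * m does not exceed the population size."""
    m = round(len(population) / n)   # sampling interval, e.g. round(2574/150) = 17
    k = random.randint(1, m)         # random start, uniform on 1..m
    return [population[k - 1 + r * m] for r in range(n)]

# Hypothetical example: 2574 stocks sorted by market capitalization, pick 150.
stocks = sorted(range(2574), reverse=True)
sample = systematic_sample(stocks, 150)
print(len(sample))
```

A single uniform draw of the start position gives each stock (within the covered range) the same inclusion probability 1/m, while the fixed stride spreads the sample evenly across the capitalization ranking — the sense in which the SS-sample is more representative than an SRS-sample of the same size.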

We have also considered ex-post liquidity measures: the volume-weighted effective spread as a proxy for immediacy costs and the volume-weighted VNET as a measure of market depth. In addition, the relative spread and the quoted depth have also been computed using the last quotes in each time interval, and the effective spread has been computed using an unweighted average. The empirical results for these alternative measures are not reported because of space limitations, but they are available upon request. In general, our main conclusions do not depend on the one-dimensional proxies considered.

Since Hasbrouck and Seppi (Citation2001) are interested in the stochastic sources of variability, they standardize the variables so that the deterministic time-of-day effects are removed. We choose not to differentiate between stochastic and deterministic sources of common variation since we are also interested in discerning whether the deterministic sources of variation of overall liquidity are the same as those of the one-dimensional measures. Therefore, we standardize by simply subtracting the mean and dividing by the standard deviation. Although principal components analysis is sensitive to the units in which the variables are measured, the standardization is not strictly necessary, since it is equivalent to extracting the eigenvectors and eigenvalues from the correlation matrix instead of the variance-covariance matrix. Nonetheless, standardization facilitates certain comparative analyses.
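The equivalence invoked in the note — principal components of standardized data coincide with the eigen-decomposition of the correlation matrix of the raw data — can be verified numerically on simulated data (an illustrative check, not the paper's dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
# Three variables measured in very different units (scales 1, 10, 100).
X = rng.normal(size=(500, 3)) * np.array([1.0, 10.0, 100.0])

# Eigenvalues of the covariance matrix of the standardized data...
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
eig_std = np.sort(np.linalg.eigvalsh(np.cov(Z, rowvar=False)))

# ...equal the eigenvalues of the correlation matrix of the raw data.
eig_corr = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))

print(np.allclose(eig_std, eig_corr))
```

Note the use of the sample standard deviation (ddof=1) so that the covariance of the standardized data is exactly the correlation matrix; with raw (unit-sensitive) covariances, the largest-scale variable would dominate the leading component.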

If we assumed a logistic distribution instead, we would obtain a Logit model. Because the cumulative normal and logistic distributions are very close to each other, except in the tails, Logit models do not yield very different results.
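The closeness of the two link functions is easy to check directly (an illustrative comparison, not part of the paper's estimation; the logistic scale is set to sqrt(3)/pi so that both distributions have unit variance):

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def logistic_cdf(x, s=math.sqrt(3.0) / math.pi):
    """Logistic CDF with scale chosen for unit variance (variance = s^2 * pi^2 / 3)."""
    return 1.0 / (1.0 + math.exp(-x / s))

# Largest absolute gap between the two CDFs on [-3, 3]: small everywhere,
# with the divergence concentrated away from the center.
gap = max(abs(normal_cdf(x / 10) - logistic_cdf(x / 10)) for x in range(-30, 31))
print(round(gap, 4))
```

With variance-matched scales the two CDFs differ by only a couple of percentage points at most over the central range, which is why Probit and Logit specifications typically lead to the same qualitative conclusions.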

Detailed results are not reported because of the limited space, but they are available upon request from the authors.
