On the bi-dimensionality of liquidity

Pages 542-566 | Published online: 19 Aug 2006
 

Abstract

Variations in overall liquidity can be measured by simultaneous changes in both immediacy costs and depth. Liquidity changes are ambiguous, however, whenever the two liquidity dimensions do not reinforce each other. In this paper, ambiguity is characterized using an instantaneous time-varying elasticity concept, and several bi-dimensional liquidity measures that cope with the ambiguity problem are constructed. First, it is shown that bi-dimensional measures are superior because commonalities in overall liquidity cannot be fully explained by the common factors in one-dimensional proxies of liquidity. Second, it is shown that an infinitesimal variation in either market volatility or trading activity increases the probability of observing an unambiguous liquidity adjustment. Ambiguity depends strongly on the expected (deterministic) component of volatility.

Acknowledgements

Pascual is grateful to the Fundación Caja Madrid for its financial support and Escribano to the Secretaría de Estado de Educación y Universidades PR2003-0305. The authors acknowledge financial support from Spanish DGICYT project #PB98-0030. We would like to thank participants at the European Financial Management 1999 Meetings in Paris (France), Foro de Finanzas 1999 in Segovia (Spain), VII Foro de Finanzas 1999 in Valencia (Spain) and the 10th (EC)2 Conference on Financial Econometrics 1999 in Madrid (Spain). The authors are especially grateful to Gonzalo Rubio and Daniel Cardona for their helpful suggestions. The contents of this paper are the sole responsibility of the authors.

Notes

The liquidity ratio (LR) is simply the ratio of accumulated trading volume to the accumulated change in prices over a given period. It ignores whether volume is buyer- or seller-initiated. The measure is also extremely sensitive: small price changes push the liquidity ratio to extremely high values. In our opinion, the sensitivity of prices to order flow is better captured by VNET, which has better properties.
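The sensitivity described above is easy to demonstrate. A minimal sketch (the function name is ours, and summing absolute price changes is one common convention; exact definitions of the accumulated price change vary across studies):

```python
import math

def liquidity_ratio(volumes, prices):
    """Liquidity ratio (LR) over a period: accumulated trading volume divided
    by the accumulated absolute price change.  Illustrative sketch only; the
    definition of the accumulated price change varies across studies."""
    accumulated_volume = sum(volumes)
    accumulated_change = sum(abs(b - a) for a, b in zip(prices, prices[1:]))
    return accumulated_volume / accumulated_change

# Identical volume, but a near-flat price path drives LR to extreme values:
lr_normal = liquidity_ratio([100, 200], [10.00, 10.50, 10.60])
lr_flat = liquidity_ratio([100, 200], [10.00, 10.00, 10.01])
print(lr_flat > 10 * lr_normal)  # True
```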

With systematic sampling (SS), all stocks have the same probability of being selected, as with simple random sampling (SRS). However, the final SS-sample is more representative than the SRS-sample. SS consists of generating a random number k between 1 and the nearest integer to 2574/150. The population is then sorted by market capitalization, and the stocks selected are those in the rk-th positions, where r = {1, 2, …, 150} (see Som, Citation1996, pp. 81–90 for details).
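The selection rule in this note can be sketched as follows (a hedged illustration: the function and parameter names are ours, and the input is assumed to be already ranked by market capitalization; the optional k argument only makes the draw reproducible):

```python
import random

def systematic_sample(ranked, n, k=None):
    """Systematic sampling as described in the note: draw a random integer k
    between 1 and the nearest integer to N/n, then select the elements at
    positions k, 2k, ..., nk (1-indexed) of the list ranked by market
    capitalization.  Illustrative sketch, not the authors' code."""
    step = round(len(ranked) / n)      # nearest integer to N/n, e.g. 2574/150 ≈ 17
    if k is None:
        k = random.randint(1, step)    # random k between 1 and step (inclusive)
    # positions r*k for r = 1..n, kept only while they fall inside the population
    return [ranked[r * k - 1] for r in range(1, n + 1) if r * k <= len(ranked)]
```

For a toy population of 34 ranked stocks and a target sample of 2, drawing k = 17 selects the stocks in positions 17 and 34.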

We have also considered ex-post liquidity measures: the volume-weighted effective spread to proxy for immediacy costs, and the volume-weighted VNET to measure market depth. In addition, the relative spread and the quoted depth have also been computed using the last quotes in each time interval, and the effective spread has been computed using an unweighted average. The empirical results for these alternative measures are not reported because of space limitations, but they are available upon request. In general, our main conclusions are independent of the one-dimensional proxies considered.

Since Hasbrouck and Seppi (Citation2001) are interested in the stochastic sources of variability, they standardize the variables so that the deterministic time-of-day effects are removed. We choose not to differentiate between stochastic and deterministic sources of common variation since we are also interested in discerning whether the deterministic sources of variation of overall liquidity are the same as those of the one-dimensional measures. Therefore, we standardize by simply subtracting the mean and dividing by the standard deviation. Although principal components analysis is sensitive to the units in which the variables are measured, the standardization is not strictly necessary, since it is equivalent to extracting the eigenvectors and eigenvalues from the correlation matrix instead of the variance-covariance matrix. Nonetheless, standardization facilitates certain comparative analyses.
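The equivalence invoked in this note is easy to verify numerically. A sketch with synthetic data (the series and their scales are arbitrary assumptions, chosen only to mimic variables measured in very different units):

```python
import numpy as np

rng = np.random.default_rng(0)
# Three synthetic series measured in very different units.
X = rng.normal(size=(500, 3)) * np.array([1.0, 10.0, 100.0])

# Standardize: subtract the mean, divide by the (sample) standard deviation.
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

# Eigenvalues of the covariance matrix of the standardized data coincide
# with those of the correlation matrix of the raw data.
eig_cov_Z = np.sort(np.linalg.eigvalsh(np.cov(Z, rowvar=False)))
eig_corr_X = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))
print(np.allclose(eig_cov_Z, eig_corr_X))  # True
```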

If we assumed a logistic distribution, we would obtain a Logit model. Because the cumulative normal distribution and the logistic distribution are very close to each other, except at the tails, Logit models do not yield very different results.
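The closeness of the two distributions can be checked directly. A sketch comparing the two CDFs, using the standard textbook rescaling of the logistic by approximately 1.702 to align it with the standard normal (the function names are ours):

```python
import math

def normal_cdf(x):
    # Standard normal CDF, computed via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def logistic_cdf(x, scale=1.702):
    # Rescaling the logistic by roughly 1.702 aligns it with the standard
    # normal; the maximum absolute gap between the CDFs is then below 0.01.
    return 1.0 / (1.0 + math.exp(-scale * x))

gap = max(abs(normal_cdf(x / 100) - logistic_cdf(x / 100))
          for x in range(-500, 501))
print(gap < 0.01)  # True
```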

Detailed results are not reported because of the limited space, but they are available upon request from the authors.
