Original Articles

Information-Theoretic Distribution Test with Application to Normality

Pages 307-329 | Published online: 07 Jan 2010
 

Abstract

We derive general distribution tests based on the method of maximum entropy (ME) density. The proposed tests are derived by maximizing the differential entropy subject to given moment constraints. By exploiting the equivalence between the ME and maximum likelihood (ML) estimates for the general exponential family, we can use the conventional likelihood ratio (LR), Wald, and Lagrange multiplier (LM) testing principles in the maximum entropy framework. In particular, we use the LM approach to derive tests for normality. Monte Carlo evidence suggests that the proposed tests are comparable to, and sometimes outperform, commonly used normality tests. We show that the proposed tests extend in a straightforward manner to tests based on regression residuals and to non-i.i.d. data. An empirical example on production function estimation is presented.
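For readers who want the optimization behind the abstract in symbols, the ME problem and its well-known exponential-family solution can be sketched as follows (generic notation, not necessarily the paper's):

  max_f H(f) = −∫ f(x) log f(x) dx
  subject to ∫ g_k(x) f(x) dx = μ_k, k = 1, …, K, and ∫ f(x) dx = 1,

with solution f(x; θ) = exp[θ_0 + ∑_{k=1}^{K} θ_k g_k(x)], where θ_0 is the normalizing constant. Because this solution is an exponential family, the θ that matches the sample moments is also the ML estimate, which is the equivalence that makes the LR, Wald, and LM machinery available.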


ACKNOWLEDGMENTS

We thank the associate editor, two anonymous referees, and seminar participants at Penn State University, the 2004 European Meeting of the Econometric Society, and the 2004 Canadian Econometrics Study Group for comments. Financial support from SSHRC of Canada is gratefully acknowledged.

Notes

Let γ′ = [γ_0, γ_1, …, γ_K] be a nonzero vector and let g_0(x) = 1. Then

  γ′ℋγ = E{[∑_{k=0}^{K} γ_k g_k(X)]²} > 0,

where the inequality is strict because the linearly independent functions g_0, …, g_K cannot satisfy ∑_k γ_k g_k(x) = 0 almost everywhere for nonzero γ. Hence, ℋ is positive-definite.

Denote θ^m = [θ_1, …, θ_m]. The only case in which θ_{m+1} = 0 is when the moment restriction is not binding, that is, when the (m + 1)th moment is identical to its prediction based on the ME density f(x; θ^m) fitted to the first m moments. In this case, the (m + 1)th moment contains no additional information that can further reduce the entropy.
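One way to see the entropy claim (a sketch in the same generic notation): the ME density from m + 1 moments solves the m-moment problem with one extra constraint, so H(f(x; θ^{m+1})) ≤ H(f(x; θ^m)). If the (m + 1)th constraint already holds at f(x; θ^m), that density remains feasible, and hence still optimal, for the larger problem; the two maximized entropies then coincide and θ_{m+1} = 0.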

Imbens et al. (1998) discussed similar tests in the information-theoretic (IT) generalized empirical likelihood framework. The proposed tests differ from theirs, which minimize the discrete Kullback–Leibler information criterion (cross entropy) or another member of the Cressie–Read family of discrepancy indices subject to moment constraints.

A t distribution with one degree of freedom is the Cauchy distribution, which has the fattest tails within the family of t distributions. See also Lye and Martin (1994) on the connection between testing for normality and the generalized Student t distribution. Premaratne and Bera (2005) also used the moment function tan⁻¹(x).

We also tried [tan⁻¹(x)]². The performance was essentially the same as that with tan⁻¹(x²).
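The appeal of bounded moment functions such as tan⁻¹(·) under fat-tailed alternatives can be checked in a few lines. The following is a minimal simulation sketch (sample size and seed are arbitrary illustrations, not from the paper):

  import numpy as np

  rng = np.random.default_rng(0)
  x = rng.standard_cauchy(size=100_000)  # t with 1 df: no finite mean or variance

  # The raw second moment does not exist, so its sample average is
  # erratic and dominated by occasional extreme draws.
  print("mean of x^2:      ", np.mean(x**2))

  # Bounded transforms always have finite moments, so their sample
  # averages are stable and usable as test moment functions.
  print("mean of atan(x):  ", np.mean(np.arctan(x)))
  print("mean of atan(x^2):", np.mean(np.arctan(x**2)))

Rerunning with different seeds shows the first average swinging wildly while the last two settle quickly, since the law of large numbers applies to the bounded transforms but not to x².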

Shannon (1949) shows that among all distributions that possess a density function f(x) and have a given variance σ², the entropy is maximized by the normal distribution. The entropy of the normal distribution with variance σ² is ½ log(2πeσ²). Vasicek (1976) uses this property to test a composite hypothesis of normality, based on a nonparametric estimate of sample entropy.
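The calculation behind that expression is short. With φ the N(μ, σ²) density, log φ(x) = −½ log(2πσ²) − (x − μ)²/(2σ²), so

  H = −E[log φ(X)] = ½ log(2πσ²) + E[(X − μ)²]/(2σ²) = ½ log(2πσ²) + ½ = ½ log(2πeσ²).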

The first two moments are zero and one by standardization.

We thank a referee for this reference.

