Original Articles

A Class of Goodness-of-fit Tests Based on Transformation

Pages 1708-1735 | Received 15 Apr 2011, Accepted 04 Mar 2012, Published online: 09 Apr 2014
 

Abstract

There is an increasing number of goodness-of-fit tests whose test statistics measure deviations between the empirical characteristic function and an estimated characteristic function of the distribution in the null hypothesis. With the aim of overcoming certain computational difficulties with the calculation of some of these test statistics, a transformation of the data is considered. To apply such a transformation, the data are assumed to be continuous with arbitrary dimension, but we also provide a modification for discrete random vectors. Practical considerations leading to analytic formulas for the test statistics are studied, as well as theoretical properties such as the asymptotic null distribution, validity of the corresponding bootstrap approximation, and consistency of the test against fixed alternatives. Five applications are provided in order to illustrate the theory. These applications also include numerical comparison with other existing techniques for testing goodness-of-fit.
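To make the general recipe concrete, the following is a minimal sketch in Python of the kind of procedure the paper studies: transform the data with the transformation evaluated at an estimated parameter, compare the empirical characteristic function of the transformed sample with the characteristic function of the uniform law, and calibrate the test with a parametric bootstrap. The exponential null, the Gaussian weight, the grid-based numerical integration, and all function names below are illustrative choices made here; they are not the paper's analytic formulas.

    import numpy as np

    rng = np.random.default_rng(0)

    def transform_exponential(x, rate):
        # For d = 1 the transformation reduces to the probability integral
        # transform u = F(x; rate) = 1 - exp(-rate * x).
        return 1.0 - np.exp(-rate * x)

    def cf_uniform01(t):
        # Characteristic function of U(0, 1): (exp(it) - 1) / (it), equal to 1 at t = 0.
        out = np.ones(t.shape, dtype=complex)
        nz = t != 0
        out[nz] = (np.exp(1j * t[nz]) - 1.0) / (1j * t[nz])
        return out

    def test_statistic(u, t_grid, a=1.0):
        # Weighted L2 distance between the empirical cf of the transformed data
        # and the U(0, 1) cf, integrated numerically with weight exp(-a t^2).
        ecf = np.exp(1j * np.outer(t_grid, u)).mean(axis=1)
        integrand = np.abs(ecf - cf_uniform01(t_grid)) ** 2 * np.exp(-a * t_grid ** 2)
        return len(u) * integrand.sum() * (t_grid[1] - t_grid[0])

    def gof_test_exponential(x, n_boot=500, a=1.0):
        t_grid = np.linspace(-20.0, 20.0, 801)
        rate_hat = 1.0 / x.mean()                      # MLE under the exponential null
        t_obs = test_statistic(transform_exponential(x, rate_hat), t_grid, a)
        t_star = np.empty(n_boot)
        for b in range(n_boot):                        # parametric bootstrap critical values
            xb = rng.exponential(scale=1.0 / rate_hat, size=len(x))
            t_star[b] = test_statistic(transform_exponential(xb, 1.0 / xb.mean()), t_grid, a)
        p_value = (1 + np.sum(t_star >= t_obs)) / (n_boot + 1)
        return t_obs, p_value

    x = rng.exponential(scale=2.0, size=100)           # data compatible with the null
    print(gof_test_exponential(x))

Under the null the transformed sample is approximately uniform, so large values of the statistic indicate departures from the hypothesized model, and the bootstrap p-value accounts for the effect of parameter estimation.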

Appendix

The list of required assumptions is as follows.

Assumption 1.

Let X1, X2, …, Xn be iid random vectors with df F and let θ̂n = θ̂n(X1, X2, …, Xn) be an estimator of θ taking values in Θ ⊆ ℝp. Assume that there exists θ ∈ intΘ, with θ = θ0 if F(·) = F(·; θ0), such that √n(θ̂n − θ) = n^{−1/2} ∑_{j=1}^{n} l(Xj; θ) + oP(1), and the components of l(x; θ) = (l1(x; θ), l2(x; θ), …, lp(x; θ))′ satisfy E{lk(X1; θ)} = 0 and E{lk(X1; θ)²} < ∞, 1 ⩽ k ⩽ p.
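As an illustration of the linear representation in this assumption (with an example chosen here, not taken from the paper): for the maximum likelihood estimator of an exponential rate, l(x; θ) = θ − θ²x, i.e., the inverse Fisher information times the score, and the representation can be checked numerically.

    import numpy as np

    rng = np.random.default_rng(1)
    theta, n = 2.0, 5000                           # true exponential rate and sample size
    x = rng.exponential(scale=1.0 / theta, size=n)

    theta_hat = 1.0 / x.mean()                     # MLE of the rate

    # Influence function of the MLE: l(x; theta) = I(theta)^{-1} * score(x; theta),
    # with score = 1/theta - x and Fisher information I(theta) = 1/theta^2,
    # hence l(x; theta) = theta - theta^2 * x, which has mean 0 and finite variance.
    linear_term = np.sqrt(n) * np.mean(theta - theta ** 2 * x)

    # The two quantities below differ by o_P(1), as the assumption requires.
    print(np.sqrt(n) * (theta_hat - theta), linear_term)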

Assumption 2.

θ̂n → θ a.s., for some θ ∈ intΘ, with θ = θ0 if F(·) = F(·; θ0).

Assumption 3.

The marginal and the conditional distributions of F(x; γ) are continuously differentiable with respect to γ, for all γ in an open neighborhood of θ, and satisfy where and τ(x; θ) is as defined in (4) and (5).

A sufficient condition for Assumption 3 to hold is that where G(x; θ) represents any marginal or conditional distribution of F(x; θ).

Assumption 4.

Σl(γ) is continuous at γ = θ, where Σl(γ) = ∫l(x; γ)l(x; γ)′ dF(x; γ) and l(x; γ) is defined in Assumption 1.
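Continuing the illustrative exponential example introduced after Assumption 1 (again an assumption of this sketch, not an example from the paper), Σl(γ) can be approximated by Monte Carlo on a grid of γ values near θ to get a feel for the continuity requirement; in the scalar exponential case Σl(γ) = γ².

    import numpy as np

    rng = np.random.default_rng(3)

    def sigma_l_exponential(gamma, n_mc=200_000):
        # Monte Carlo approximation of Sigma_l(gamma) = E[l(X; gamma) l(X; gamma)']
        # under F(.; gamma), for the exponential-rate example l(x; gamma) = gamma - gamma^2 x.
        x = rng.exponential(scale=1.0 / gamma, size=n_mc)
        l = gamma - gamma ** 2 * x
        return np.mean(l * l)                      # p = 1, so Sigma_l(gamma) is a scalar

    # Evaluating on a grid around theta = 2 gives values close to gamma^2, varying smoothly.
    print([round(sigma_l_exponential(g), 3) for g in (1.9, 2.0, 2.1)])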

Assumption 5.

The marginal and the conditional distributions of F(x; γ) are differentiable with respect to γ, for all γ in an open neighborhood of θ, and satisfy where and τ(r)(x; θ) denotes the Rosenblatt transformation for the r-th coordinate permutation, 1 ⩽ r ⩽ d!.
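For concreteness, the sketch below computes the Rosenblatt transformation τ(r)(x; θ) for a bivariate normal model and one of the d! = 2 coordinate permutations; the bivariate normal family and the function names are choices made here for illustration.

    import numpy as np
    from scipy.stats import norm

    def rosenblatt_bvn(x, mu, sigma, rho, perm=(0, 1)):
        # Rosenblatt transformation of a bivariate normal for the coordinate
        # permutation 'perm': the first listed coordinate through its marginal df,
        # the second through its conditional df given the first.
        i, j = perm
        z = (x[:, [i, j]] - np.array([mu[i], mu[j]])) / np.array([sigma[i], sigma[j]])
        u1 = norm.cdf(z[:, 0])                                              # marginal of coordinate i
        u2 = norm.cdf((z[:, 1] - rho * z[:, 0]) / np.sqrt(1.0 - rho ** 2))  # coordinate j given i
        return np.column_stack([u1, u2])

    rng = np.random.default_rng(2)
    mu, sigma, rho = np.array([0.0, 1.0]), np.array([1.0, 2.0]), 0.5
    cov = [[sigma[0] ** 2, rho * sigma[0] * sigma[1]],
           [rho * sigma[0] * sigma[1], sigma[1] ** 2]]
    x = rng.multivariate_normal(mu, cov, size=1000)
    u = rosenblatt_bvn(x, mu, sigma, rho, perm=(0, 1))     # r = 1 of the 2 permutations
    # Under the correct model u is approximately uniform on [0, 1]^2:
    print(u.mean(axis=0), u.std(axis=0))                   # ≈ (0.5, 0.5) and ≈ 0.289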

Assumption 6.

As γ → ∞, where Λ is a neighborhood of θ.

Proof of Theorems 3.1 and 4.1.

Theorems 3.1 and 4.1 follow from Theorems 3.1 and 4.1 in Meintanis and Swanepoel (2007), respectively, by applying the Cramér–Wold device. Although Theorems 3.1 and 4.1 in Meintanis and Swanepoel (2007) assume that the data are univariate, the dimension of the data plays no role in their proofs, so the results in these theorems are also valid for d-dimensional data, for any fixed d ⩾ 1.
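For reference, the Cramér–Wold device invoked here states that weak convergence of random vectors is equivalent to weak convergence of every one-dimensional linear projection; in standard notation,

    X_n \xrightarrow{\;d\;} X
    \quad\Longleftrightarrow\quad
    t' X_n \xrightarrow{\;d\;} t' X \ \text{ for every fixed } t \in \mathbb{R}^d ,

which is why a univariate argument extends to any fixed dimension d ⩾ 1.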

Proof of Theorem 5.1.

Since and , by the Dominated Convergence Theorem and the SLLN, and in order to prove the result it suffices to see that (17), where g is as defined in (8). A first-order Taylor expansion gives (18), where , for some α ∈ (0, 1). By Assumption 5 and , the right side of (18) is oP(1), and thus (17) holds.
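The expansion referred to as (18) has, in generic form, the usual mean-value structure; as a sketch (with h a differentiable real-valued function standing in for the quantity actually expanded in the paper's notation),

    h(\hat\theta_n) = h(\theta) + (\hat\theta_n - \theta)'\,
    \nabla h\bigl(\theta + \alpha(\hat\theta_n - \theta)\bigr),
    \qquad \text{for some } \alpha \in (0, 1),

so that when θ̂n − θ tends to zero and the derivative term is suitably controlled (the role played here by Assumption 5), the remainder term is oP(1).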
