Original Articles

Asymptotic biases of information and cross-validation criteria under canonical parametrization

Pages 964-985 | Received 13 Jul 2017, Accepted 22 Dec 2017, Published online: 09 Mar 2018
 

ABSTRACT

An asymptotic expansion of the cross-validation criterion (CVC) based on the Kullback-Leibler distance is derived for the leave-k-out method with parameters estimated by the weighted score method. From this expansion, the asymptotic biases of both the Takeuchi information criterion (TIC) and the CVC are obtained. Under the canonical parametrization of the exponential family of distributions with maximum likelihood estimation, the magnitudes of the asymptotic biases of the Akaike information criterion (AIC) and the CVC are shown to be smaller than that of the TIC. Examples for typical statistical distributions are given.
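As a rough illustration of the quantities compared in the abstract, the sketch below computes the AIC, the TIC, and a leave-one-out (k = 1) Kullback-Leibler cross-validation criterion for an i.i.d. normal model fitted by maximum likelihood. This is not code from the paper: the normal model, the -2 log-likelihood scale, and all function names are illustrative assumptions, and the weighted score estimation and general leave-k-out scheme treated in the article are not implemented.

```python
# Minimal sketch (assumptions noted above): AIC, TIC, and leave-one-out
# KL-based cross-validation for an i.i.d. normal model fitted by MLE.
import numpy as np

def loglik(x, mu, s2):
    # Per-observation Gaussian log-density
    return -0.5 * np.log(2 * np.pi * s2) - (x - mu) ** 2 / (2 * s2)

def mle(x):
    # Gaussian MLEs (variance with the 1/n divisor)
    return x.mean(), x.var()

def criteria(x):
    n = x.size
    mu, s2 = mle(x)
    ll = loglik(x, mu, s2).sum()

    # AIC: -2 * log-likelihood + 2 * (number of parameters)
    aic = -2 * ll + 2 * 2

    # TIC: -2 * log-likelihood + 2 * tr(J^{-1} I), with J the average negative
    # Hessian and I the average outer product of per-observation scores.
    r = x - mu
    score = np.stack([r / s2, -0.5 / s2 + r ** 2 / (2 * s2 ** 2)], axis=1)
    I_hat = score.T @ score / n
    J_hat = -np.array([
        [-1.0 / s2,            -r.mean() / s2 ** 2],
        [-r.mean() / s2 ** 2,   0.5 / s2 ** 2 - (r ** 2).mean() / s2 ** 3],
    ])
    tic = -2 * ll + 2 * np.trace(np.linalg.solve(J_hat, I_hat))

    # Leave-one-out CVC on the same -2 log scale: refit without observation i
    # and evaluate its predictive log-density.
    cvc = 0.0
    for i in range(n):
        xi = np.delete(x, i)
        mu_i, s2_i = mle(xi)
        cvc += -2 * loglik(x[i], mu_i, s2_i)

    return aic, tic, cvc

rng = np.random.default_rng(0)
x = rng.normal(size=200)
print(criteria(x))
```

For a correctly specified model, tr(J^{-1} I) is close to the number of parameters, so the three criteria typically give similar values on such a sample; the article's results concern the higher-order (asymptotic bias) differences between them.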

Additional information

Funding

This work was supported by a Grant-in-Aid for Scientific Research from the Japanese Ministry of Education, Culture, Sports, Science and Technology [JSPS KAKENHI Grant No. 17K00042].
