Abstract
Let $f_\lambda$ be a kernel estimate (with window width $\lambda$) of the density $f$. Its performance is assessed by the Kullback-Leibler information distance $I(f, f_\lambda) = \int f \log f - \int f \log f_\lambda$. This article establishes conditions for the asymptotic equivalence of the cross-validation estimate and the jackknife estimate of the term $\int f \log f_\lambda$, and provides the common limiting value. This gives insight into the “modified likelihood” criterion for choosing $\lambda$, introduced by Habbema, Hermans, and Van den Broek (1974) and Duin (1976).
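The modified likelihood criterion referred to above chooses $\lambda$ to maximize the leave-one-out likelihood $\prod_i f_{\lambda,-i}(X_i)$, where $f_{\lambda,-i}$ is the kernel estimate computed without the $i$th observation. The following is a minimal sketch of that criterion, not the article's analysis: it assumes a Gaussian kernel, a synthetic sample, and an illustrative grid of candidate window widths.

```python
# Sketch of likelihood cross-validation for the window width lambda of a
# Gaussian kernel density estimate. Sample and grid are illustrative.
import numpy as np

def loo_log_likelihood(x, lam):
    """Leave-one-out log-likelihood: sum_i log f_{lam,-i}(x_i),
    where f_{lam,-i} is the kernel estimate built without x_i."""
    n = len(x)
    # Pairwise Gaussian kernel evaluations K((x_i - x_j) / lam) / lam.
    diffs = (x[:, None] - x[None, :]) / lam
    k = np.exp(-0.5 * diffs**2) / (lam * np.sqrt(2.0 * np.pi))
    np.fill_diagonal(k, 0.0)         # drop the i = j term (leave one out)
    f_loo = k.sum(axis=1) / (n - 1)  # f_{lam,-i}(x_i) for each i
    return np.log(f_loo).sum()

rng = np.random.default_rng(0)
x = rng.normal(size=200)              # illustrative sample from f
lams = np.linspace(0.05, 1.0, 40)     # candidate window widths
best = max(lams, key=lambda lam: loo_log_likelihood(x, lam))
print(f"cross-validated window width: {best:.3f}")
```

Maximizing the leave-one-out log-likelihood is, up to the constant $\int f \log f$, a cross-validation estimate of $-I(f, f_\lambda)$, which is the connection the abstract draws.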