Theory and Method

Bootstrapping in Nonparametric Regression: Local Adaptive Smoothing and Confidence Bands

Pages 102-110 | Received 01 Jun 1985, Published online: 12 Mar 2012
 

Abstract

The operation of the bootstrap in the context of nonparametric regression is considered. Bootstrap samples are taken from estimated residuals to study the distribution of a suitably recentered kernel estimator. The application of this principle to the local adaptive choice of bandwidth and to the construction of confidence bands is investigated and compared with a direct method based on asymptotic means and variances.

The technique of the bootstrap is to replace each occurrence of the unknown distribution in the definition of the statistical function of interest by the empirical distribution function of the observed errors. In a regression context these errors are not observed directly, but their role can be played by the residuals from the fitted model. In this article the fitted model is a kernel nonparametric regression estimator. Because nonparametric smoothing is involved, an additional difficulty is created by the bias incurred in smoothing; this bias, however, can be estimated consistently. These considerations suggest how the distribution of the nonparametric estimate about the true curve at a point of interest may be approximated by suitable recentering of the nonparametric estimates based on bootstrap samples. The bootstrap samples are constructed by adding to the fitted curve errors drawn at random, with replacement, from the collection of recentered and bias-corrected residuals of the original data. A theorem is proved establishing that the bootstrap distribution approximates the distribution of interest in the Mallows metric.

Two applications are considered. The first uses bootstrap sampling to approximate the mean squared error of the nonparametric estimate at a point of interest; this can then be minimized over the smoothing parameter to adapt the degree of smoothing to the local behavior of the underlying curve.
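The residual-bootstrap construction described above can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the Nadaraya-Watson estimator, the Gaussian kernel, the bandwidths `h` and `g`, and the crude "smooth the oversmoothed fit again" bias estimate are all illustrative assumptions.

```python
import numpy as np

def nw_kernel_estimate(x_eval, x, y, h):
    """Nadaraya-Watson kernel regression estimate with a Gaussian kernel."""
    w = np.exp(-0.5 * ((np.asarray(x_eval)[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(0)
n = 100
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(n)

h = 0.05                          # estimation bandwidth (illustrative choice)
fit = nw_kernel_estimate(x, x, y, h)
resid = y - fit
resid = resid - resid.mean()      # recenter the residuals to mean zero

# Rough bias estimate (assumption): smooth an oversmoothed pilot fit again
# with bandwidth h and take the difference as the smoothing bias.
g = 0.2                           # oversmoothing pilot bandwidth (assumption)
pilot = nw_kernel_estimate(x, x, y, g)
bias_hat = nw_kernel_estimate(x, x, pilot, h) - pilot

# Residual bootstrap: draw errors with replacement, add them to the fitted
# curve, refit, and recenter with the bias estimate.
B = 500
boot = np.empty((B, n))
for b in range(B):
    y_star = fit + rng.choice(resid, size=n, replace=True)
    boot[b] = nw_kernel_estimate(x, x, y_star, h) - fit - bias_hat
```

The rows of `boot` then serve as draws from an approximation to the distribution of the recentered estimator about the true curve.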
The second application uses the percentiles of the approximate distribution to construct confidence intervals for the curve at specific design points. In both cases the performance of the bootstrap is compared with a simple "plug-in" method based on direct estimation of the terms in an asymptotic expansion. The two methods perform very similarly in general. The bootstrap, however, has the slight advantage of being less sensitive than the direct method to second derivatives near 0 in the local adaptive smoothing problem. In the construction of confidence intervals, the bootstrap is able to reflect features such as skewness, but it falls slightly short of the target coverage because of inaccuracies in centering when the second derivative of the curve is large.
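The two applications can be sketched together. Again this is a simplified illustration under assumed choices (Nadaraya-Watson estimator, Gaussian kernel, pilot bandwidth `h0`, candidate grid `grid`); in particular, the bootstrap MSE below measures only spread about the original estimate and omits the bias term that the paper's full criterion includes.

```python
import numpy as np

def nw(x_eval, x, y, h):
    """Nadaraya-Watson kernel regression estimate (Gaussian kernel)."""
    w = np.exp(-0.5 * ((np.asarray(x_eval)[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(1)
n = 100
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(n)

h0 = 0.05                        # pilot bandwidth used to form residuals
fit = nw(x, x, y, h0)
resid = y - fit
resid = resid - resid.mean()

x0 = np.array([0.5])             # design point of interest (assumption)
B = 400

def boot_reps(h, B):
    """Bootstrap replicates of the estimate at x0 under bandwidth h."""
    return np.array([nw(x0, x, fit + rng.choice(resid, n, replace=True), h)[0]
                     for _ in range(B)])

# Application 1 -- local adaptive smoothing: choose the bandwidth at x0 by
# minimizing the bootstrap mean squared error over a candidate grid.
grid = [0.03, 0.05, 0.08, 0.12, 0.20]
mse = [np.mean((boot_reps(h, B) - nw(x0, x, y, h)[0]) ** 2) for h in grid]
h_local = grid[int(np.argmin(mse))]

# Application 2 -- percentile confidence interval for the curve at x0.
reps = boot_reps(h_local, B)
lo, hi = np.percentile(reps, [2.5, 97.5])
```

Because the interval endpoints come from the empirical percentiles of the bootstrap replicates rather than a normal approximation, they can reflect skewness in the sampling distribution, as the abstract notes.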
