Original Articles

Bootstrapping by Monte Carlo versus approximating the estimator and bootstrapping exactly: Cost and performance

Pages 395-424 | Received 01 Feb 1985, Published online: 27 Jun 2007
 

Abstract

Approximations to the bootstrap estimates of bias and variance may be obtained by replacing the estimate to be bootstrapped by one which is linear, or quadratic, in the resampling vector P. The bootstrap bias and variance of these linear and quadratic approximations may then be evaluated analytically. These estimators are discussed and then investigated via a Monte Carlo experiment in which the estimate being bootstrapped is the least squares regression coefficient estimate. Included amongst these bias and variance estimates are the standard jackknife, Jaeckel's [1972] infinitesimal jackknife, and Hinkley's [1977] jackknife. Good performers included a quadratic infinitesimal jackknife and the Monte Carlo evaluated bootstrap when the number of replicates was about 6 or 7 times the sample size. Poor performers included the standard jackknife and Hinkley's jackknife. Further, when computational costs were held equal, bootstrapping by Monte Carlo did better than the jackknife.
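
To make the comparison concrete, here is a minimal Python sketch (not the authors' code; the simulated data, function names, and the finite-difference influence calculation are illustrative assumptions) that estimates the bias and variance of a least squares slope by Monte Carlo bootstrap with roughly 7 times the sample size in replicates, and compares the variance with a linear, infinitesimal-jackknife style approximation in the resampling weights.

```python
# Sketch: Monte Carlo bootstrap vs. a linear (infinitesimal-jackknife)
# approximation for the least squares slope. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def ls_slope(x, y, w=None):
    """Weighted least squares slope of y on x; equal weights by default."""
    if w is None:
        w = np.ones_like(x)
    w = w / w.sum()
    xbar, ybar = w @ x, w @ y
    return w @ ((x - xbar) * (y - ybar)) / (w @ (x - xbar) ** 2)

# Toy data (assumed; the paper's design is not reproduced here).
n = 30
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(scale=1.0, size=n)
theta_hat = ls_slope(x, y)

# Monte Carlo bootstrap: resample observations, B about 7 * n replicates.
B = 7 * n
reps = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, size=n)
    reps[b] = ls_slope(x[idx], y[idx])
boot_bias = reps.mean() - theta_hat
boot_var = reps.var(ddof=1)

# Linear approximation: empirical influence values obtained by perturbing
# each observation's resampling weight slightly (finite-difference version
# of the infinitesimal jackknife), then the closed-form variance formula.
eps = 1e-5
base_w = np.full(n, 1.0 / n)
U = np.empty(n)
for i in range(n):
    w = base_w.copy()
    w[i] += eps
    U[i] = (ls_slope(x, y, w) - theta_hat) / eps
ij_var = (U ** 2).sum() / n ** 2

print(f"theta_hat = {theta_hat:.4f}")
print(f"bootstrap (B={B}): bias = {boot_bias:.4f}, var = {boot_var:.4f}")
print(f"infinitesimal jackknife var = {ij_var:.4f}")
```

When the statistic is replaced by its first-order expansion in the resampling weights, the bootstrap variance reduces to a closed-form sum of squared influence values and requires no resampling; the quadratic approximations studied in the paper add second-order terms to this expansion, at the cost of extra derivative calculations.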
