Abstract
Approximations to the bootstrap estimates of bias and variance may be obtained by replacing the estimate to be bootstrapped by one that is linear, or quadratic, in the resampling vector P. The bootstrap bias and variance of these linear and quadratic approximations may then be evaluated analytically. The resulting estimators are discussed and then investigated via a Monte Carlo experiment in which the estimate being bootstrapped
is the least squares regression coefficient estimate. Included among these bias and variance estimates are the standard jackknife, Jaeckel's [1972] infinitesimal jackknife, and Hinkley's [1977] jackknife. Good performers included a quadratic infinitesimal jackknife and the Monte Carlo-evaluated bootstrap when the number of replicates was about 6 or 7 times the sample size. Poor performers included the standard jackknife and Hinkley's jackknife. Further, when computational costs were held equal, bootstrapping by Monte Carlo did better than the jackknife.
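The Monte Carlo bootstrap referred to above can be sketched as follows. This is a minimal illustration, not the paper's own code: the data, the pairs-resampling scheme, and the choice B = 7n (following the abstract's suggestion of roughly 6 or 7 times the sample size) are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def slope(x, y):
    # Least squares regression coefficient (slope) of y on x.
    xc = x - x.mean()
    return np.dot(xc, y) / np.dot(xc, xc)

# Hypothetical data for illustration only.
n = 20
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)

theta_hat = slope(x, y)

B = 7 * n  # number of bootstrap replicates, about 7 times the sample size
reps = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, size=n)   # resample (x, y) pairs with replacement
    reps[b] = slope(x[idx], y[idx])

bias_est = reps.mean() - theta_hat     # bootstrap estimate of bias
var_est = reps.var(ddof=1)             # bootstrap estimate of variance
```

The analytic alternatives discussed in the abstract replace this resampling loop with a closed-form evaluation over a linear or quadratic approximation to the statistic, trading Monte Carlo error for approximation error.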