Abstract
A new algorithm—backward elimination via repeated data splitting (BERDS)—is proposed for variable selection in regression. Initially, the data are partitioned into two sets {E, V}, and an exhaustive backward elimination (BE) is performed in E. For each p-value cutoff α used in BE, the corresponding fitted model from E is validated in V by computing the sum of squared deviations of observed from predicted values. This is repeated m times, and the α minimizing the sum of the m sums of squares is used as the cutoff in a final BE on the entire data set. BERDS is a modification of the algorithm BECV proposed by Thall, Simon, and Grier (1992). An extensive simulation study shows that, compared to BECV, BERDS yields smaller model error and higher probabilities of excluding noise variables, of selecting each of several uncorrelated true predictors, and of selecting exactly one of two or three highly correlated true predictors. BERDS is also superior to standard BE with cutoffs .05 or .10, and this superiority increases with the number of noise variables in the data and the degree of correlation among true predictors. An application is provided for illustration.
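The procedure described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes a linear model with an intercept column, uses a normal approximation to the t distribution for p-values, and makes up its own function names (`berds`, `backward_eliminate`), split fraction, and α grid, none of which are specified in the abstract.

```python
import numpy as np
from math import erf, sqrt

def ols_pvalues(X, y):
    """Fit OLS (X already contains an intercept column); return (beta, two-sided p-values)."""
    n, k = X.shape
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - k)                 # residual variance estimate
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    t = np.abs(beta / se)
    # Normal approximation to the t reference distribution (adequate for moderate n).
    p = 2 * (1 - 0.5 * (1 + np.array([erf(v / sqrt(2)) for v in t])))
    return beta, p

def backward_eliminate(X, y, alpha):
    """Backward elimination: drop the least significant predictor until all p-values <= alpha."""
    keep = list(range(1, X.shape[1]))                # column 0 = intercept, always kept
    while keep:
        _, p = ols_pvalues(X[:, [0] + keep], y)
        worst = int(np.argmax(p[1:]))                # ignore the intercept's p-value
        if p[1 + worst] <= alpha:
            break
        keep.pop(worst)
    return keep

def berds(X, y, alphas, m=10, split=0.5, rng=None):
    """BERDS sketch: m random {E, V} splits; pick the alpha minimizing total validation SSE;
    run a final backward elimination on the full data with that cutoff."""
    rng = np.random.default_rng(rng)
    n = len(y)
    sse = np.zeros(len(alphas))
    for _ in range(m):
        idx = rng.permutation(n)
        e, v = idx[: int(split * n)], idx[int(split * n):]
        for j, a in enumerate(alphas):
            cols = [0] + backward_eliminate(X[e], y[e], a)   # BE in E at cutoff a
            beta, _ = ols_pvalues(X[e][:, cols], y[e])
            sse[j] += np.sum((y[v] - X[v][:, cols] @ beta) ** 2)  # validate in V
    best = alphas[int(np.argmin(sse))]
    return backward_eliminate(X, y, best), best

# Demo on simulated data: columns 1-2 are true predictors, columns 3-6 are noise.
gen = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), gen.normal(size=(n, 6))])
y = 2.0 * X[:, 1] + 1.5 * X[:, 2] + gen.normal(size=n)
keep, best_alpha = berds(X, y, alphas=[0.01, 0.05, 0.10, 0.20], m=5, rng=1)
```

With signals this strong, the final model should retain columns 1 and 2 and drop most or all of the noise columns; the selected cutoff varies with the random splits.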