Abstract
The proper combination of parametric and nonparametric regression procedures can improve upon the shortcomings of each when used individually. Considered is the situation where the researcher has an idea of which parametric model should explain the behavior of the data, but this model is not adequate throughout the entire range of the data. An extension of partial linear regression and two other methods of model-robust regression are developed and compared in this context. The model-robust procedures each involve the proportional mixing of a parametric fit to the data and a nonparametric fit to either the data or the residuals. Asymptotically optimal estimates of the mixing parameters are given, along with their convergence rates. Performance is assessed in terms of bias and variance, and theoretical mean squared error formulas are used to compare the procedures. Simulation results establish the accuracy of the theoretical formulas and illustrate the potential benefits of the model-robust procedures. Two examples are given: Example 1 uses data generated from an underlying model with a specified misspecification to demonstrate the theoretical benefits of the model-robust procedures, and Example 2 supplies an interesting application.
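The central idea of the abstract can be illustrated with a minimal sketch. The snippet below is an assumed, simplified version of the mixing scheme: a parametric (linear) fit and a nonparametric (Nadaraya-Watson kernel) fit are combined in proportions governed by a mixing parameter `lam`. The function names, the fixed bandwidth `h`, and the choice of `lam` are all illustrative assumptions; the paper derives asymptotically optimal data-driven estimates of the mixing parameter, which are not reproduced here.

```python
import numpy as np

def linear_fit(x, y):
    """Parametric component: ordinary least-squares line at the observed x."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ beta

def kernel_fit(x, y, h=0.05):
    """Nonparametric component: Nadaraya-Watson estimator, Gaussian kernel."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def model_robust_fit(x, y, lam=0.5, h=0.05):
    """Proportional mix: (1 - lam) * parametric + lam * nonparametric.

    lam = 0 recovers the pure parametric fit; lam = 1 the pure
    nonparametric fit. (lam here is a user-chosen illustration, not the
    paper's asymptotically optimal estimate.)
    """
    return (1 - lam) * linear_fit(x, y) + lam * kernel_fit(x, y, h)

# Data from a linear trend plus a smooth departure the linear model misses,
# mimicking a parametric model that is inadequate over part of the range.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2 + 3 * x + 0.5 * np.sin(4 * np.pi * x) + rng.normal(0.0, 0.1, 50)
yhat = model_robust_fit(x, y, lam=0.5)
```

A third variant described in the abstract instead smooths the *residuals* from the parametric fit and adds a scaled portion of that smooth back to the parametric fit; the convex-combination form above is shown only as the simplest case.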