Theory and Method

Best Median-Unbiased Estimation in Linear Regression with Bounded Asymmetric Loss Functions

Pages 886-893 | Received 01 Sep 1985, Published online: 12 Mar 2012
 

Abstract

This article considers optimal median-unbiased estimation in a linear regression model with the distribution of the errors lying in a subclass of the elliptically symmetric distributions. The generalized least squares (GLS) estimator is shown to be best for any monotone loss function, that is, any loss function that is nondecreasing as the magnitude of underestimation or overestimation increases. This includes bounded asymmetric loss functions. For the same loss functions, a restricted GLS estimator is shown to be best when the estimand is known to lie in an interval. For the case of normal errors, a best median-unbiased estimator of the error variance σ² is given for restricted and unrestricted parameter spaces. This estimator differs from the sample variance s². In comparison with best mean-unbiased estimators of regression and variance parameters, the best median-unbiased estimators considered here take advantage of restrictions on the parameter space and are optimal with respect to a much wider class of loss functions, in particular both bounded and unbounded loss functions.

The choice of median-unbiasedness, as opposed to mean-unbiasedness, is not crucial when deriving an optimality result for the estimation of regression parameters when the model has elliptically symmetric errors, provided the parameter space is unrestricted or is restricted only by linear constraints. The reason is that many estimators considered in the literature have symmetric distributions about the estimand in this context and hence are both median- and mean-unbiased if their expectations exist. (Proper Bayes and shrinkage estimators are the two main classes of estimators that do not have symmetric distributions and are neither mean- nor median-unbiased.)
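As a quick illustration of this point, the following sketch (a hypothetical Monte Carlo in Python, not taken from the article) draws errors from a multivariate t distribution, which is a variance mixture of normals of the kind considered below, and checks empirically that the least squares coefficient is centered on the true value in both the mean and the median sense.

```python
# Hypothetical illustration: when the errors are symmetric about zero, the
# least squares estimator is distributed symmetrically about beta, so its
# Monte Carlo mean and median both sit at the true value.
import numpy as np

rng = np.random.default_rng(0)
n, beta_true, n_rep = 50, 2.0, 20_000
x = rng.normal(size=n)

estimates = np.empty(n_rep)
for r in range(n_rep):
    # t_5 errors: one random scale for the whole vector gives a variance
    # mixture of multivariate normals (elliptically symmetric).
    scale = np.sqrt(5 / rng.chisquare(5))
    e = scale * rng.normal(size=n)
    y = beta_true * x + e
    estimates[r] = (x @ y) / (x @ x)   # OLS slope (no intercept)

print("mean   of estimates:", estimates.mean())      # close to 2.0
print("median of estimates:", np.median(estimates))  # close to 2.0
```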

On the other hand, if the parameter space of the regression parameters is restricted by nonlinear constraints on the parameters, then the mean-unbiasedness condition becomes much more restrictive than median-unbiasedness. This occurs because estimators that take advantage of the restrictions on the parameters generally are mean-biased. Median-unbiased estimators, however, can be adjusted to take account of restrictions without losing their property of median-unbiasedness. Thus our use of the condition of median-unbiasedness, rather than mean-unbiasedness, is of little consequence when the parameter space is unrestricted and is a distinct advantage when the parameter space is restricted by nonlinear constraints on the parameters.
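A minimal sketch of this adjustment, under assumptions of my own rather than the article's regression setting (a single normal mean and a known interval): censoring a median-unbiased estimator at the endpoints of the interval leaves the probability of underestimation at one half whenever the true value lies inside the interval, while the censored estimator's mean is pulled away from the true value.

```python
# Hypothetical sketch: censoring a median-unbiased estimator to a known
# interval [a, b] keeps it median-unbiased, whereas its Monte Carlo mean
# is pulled toward the interior of the interval (it becomes mean-biased).
import numpy as np

rng = np.random.default_rng(1)
a, b = 0.0, 1.0            # known restriction: theta lies in [a, b]
theta = 0.15               # true value, near the lower endpoint
n_rep = 100_000

unrestricted = theta + rng.normal(size=n_rep)   # median-unbiased for theta
restricted = np.clip(unrestricted, a, b)        # censored (restricted) version

print("P(restricted estimate <= theta):", np.mean(restricted <= theta))  # ~ 0.5
print("mean of restricted estimate:    ", restricted.mean())             # != 0.15
```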

The class of error distributions that we consider consists of distributions that are consistent with elliptical symmetry for any sample size. Such distributions are rotated variance mixtures of multivariate normal distributions (and hence include multivariate normal distributions). Examples are given of cases in which such distributions may arise.
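For concreteness, the following sketch (my own construction, with an inverse chi-square mixing variable chosen purely for illustration) draws one error vector from this class: a single random scale multiplies a multivariate normal vector, which yields a multivariate t distribution as one member of the family of variance mixtures of multivariate normals.

```python
# Hypothetical sketch: one draw from a variance mixture of multivariate
# normals, e = sqrt(V) * z.  With V = nu / chi-square(nu) the result is a
# multivariate t_nu error vector; nu = 1 gives the multivariate Cauchy case
# mentioned in Section 2.
import numpy as np

rng = np.random.default_rng(2)
n, nu = 30, 4

# Any positive definite covariance could be used; the identity is the simplest case.
z = rng.multivariate_normal(mean=np.zeros(n), cov=np.eye(n))
v = nu / rng.chisquare(nu)   # mixing variable: random variance scale
e = np.sqrt(v) * z           # elliptically symmetric error vector
print(e[:5])
```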

The contents of this article are organized as follows. Section 1 briefly reviews recent results by Kariya (1985) and Hwang (1985) that are related to the results given here. Section 2 shows that the GLS estimator is the best median-unbiased estimator of the regression parameters for quite general loss functions, when the parameter space is unrestricted. Of note is the fact that this result holds without moment restrictions. Thus the errors may have a multivariate Cauchy distribution. Section 3 shows that a restricted GLS estimator is best median-unbiased for a linear combination of the regression parameters, when that linear combination is restricted to lie in an interval. Certain other linear combinations of the parameter vector may be subject to arbitrary additional restrictions. Section 4 presents best median-unbiased estimators of the error variance σ², as well as monotone functions of σ², when the errors are normally distributed. If σ² is constrained to lie in a finite interval, the best estimator is a censored version of its unconstrained counterpart. When σ² is constrained only to be positive, the best median-unbiased estimator is always larger than the best mean-unbiased estimator s² and is approximately equal to s² calculated with its degrees of freedom reduced by .66. The Appendix gives proofs of the results. These make use of results due to Lehmann (1959) and Pfanzagl (1979).
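The σ² result can be made concrete with a short numerical sketch (mine, using SciPy's chi-square quantile function under the normal-error assumption). Since the residual sum of squares divided by σ² is chi-square with ν degrees of freedom, dividing the residual sum of squares by the median of that chi-square distribution gives an estimator whose probability of underestimating σ² is exactly one half; and because the median of a chi-square distribution is roughly ν minus two thirds, this estimator is approximately s² computed with its degrees of freedom reduced by .66.

```python
# Hypothetical numerical sketch of the unrestricted sigma^2 result under
# normal errors.  RSS / sigma^2 ~ chi-square(nu), so RSS divided by the
# median of a chi-square(nu) distribution is median-unbiased for sigma^2;
# it exceeds s^2 = RSS / nu and is close to RSS / (nu - 0.66).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, k, sigma2 = 40, 3, 2.0
nu = n - k                                   # residual degrees of freedom

X = rng.normal(size=(n, k))
y = X @ np.ones(k) + np.sqrt(sigma2) * rng.normal(size=n)
resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
rss = resid @ resid

s2 = rss / nu                                # best mean-unbiased estimator
med_chi2 = stats.chi2.median(nu)             # median of chi-square(nu)
sigma2_med = rss / med_chi2                  # median-unbiased estimator

print("s^2:                  ", s2)
print("median-unbiased:      ", sigma2_med)
print("RSS / (nu - 0.66):    ", rss / (nu - 0.66))       # close to the line above
print("chi2 median vs nu-2/3:", med_chi2, nu - 2 / 3)
```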
