Abstract
This paper proposes and investigates the performance of a modified model for estimating the Box-Cox transformation parameter in single-sample problems. The proposed model introduces a covariate to force the data towards near normality. Asymptotic and simulation results show that the variance of the new estimator is, in most cases, smaller than that of the Box-Cox normal likelihood estimator, which in turn leads to better accuracy in estimating the mean and the standard deviation of the underlying model. However, owing to the bias of the new estimator, the Box-Cox estimator is found to perform slightly better for data sets with small dispersion and a large coefficient of variation.
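For context, the baseline against which the proposed estimator is compared is the standard Box-Cox normal likelihood estimator. As a brief reminder (in notation assumed here, not drawn from the paper), the transformation and its profile log-likelihood are

\[
y^{(\lambda)} =
\begin{cases}
\dfrac{y^{\lambda}-1}{\lambda}, & \lambda \neq 0,\\[4pt]
\log y, & \lambda = 0,
\end{cases}
\qquad
\ell_{\max}(\lambda) = -\frac{n}{2}\log\hat{\sigma}^{2}(\lambda) + (\lambda-1)\sum_{i=1}^{n}\log y_{i},
\]

where \(\hat{\sigma}^{2}(\lambda)\) is the maximum-likelihood estimate of the variance of the transformed observations \(y_i^{(\lambda)}\); the Box-Cox estimate \(\hat{\lambda}\) is the value maximizing \(\ell_{\max}(\lambda)\).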