Abstract
Our main concern is the second-order asymptotic optimality of estimators. The φ-divergence loss is used as a criterion for evaluating the performance of estimators. In comparing any two estimators, a condition under which one estimator dominates the other under the φ-divergence risk is obtained by evaluating the second-order term in the difference between their risks. It is shown that this condition is characterized by a particular quantity associated with the φ-divergence loss, called the divergence-loss coefficient. Furthermore, the comparison based on the φ-divergence loss is shown not to coincide with that based on any of the standard loss functions, including the mean squared error, the absolute loss, and the 0-1 loss. In addition, a necessary and sufficient condition for an estimator to be second-order admissible is derived.
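For context, the φ-divergence between two densities is commonly defined as follows (this is the standard textbook form; the paper's precise regularity conditions on φ are not restated here):

\[
D_{\varphi}(p, q) = \int q(x)\,\varphi\!\left(\frac{p(x)}{q(x)}\right)\,dx,
\]

where \(\varphi\) is a convex function with \(\varphi(1) = 0\). Familiar special cases include \(\varphi(t) = t \log t\), which yields the Kullback–Leibler divergence, and \(\varphi(t) = (\sqrt{t} - 1)^2\), which yields the squared Hellinger distance.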