Abstract
In this article, we revisit the classic inference problem of minimum risk point estimation for an unknown normal mean when the variance also remains unknown. We propose an alternative three-stage sampling procedure whose termination rule is defined via Gini's mean difference rather than the traditional sample standard deviation. A number of asymptotic properties are investigated both theoretically and empirically. An extensive set of simulations is conducted to demonstrate the performance of the new procedure. For practical purposes, we also include illustrations using real data sets on the number of days marigold seeds take to flower.
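For readers who wish to experiment with the termination statistic, the following is a minimal sketch in Python, not reproduced from the article: it computes Gini's mean difference and its rescaling to a consistent estimator of the normal standard deviation, which is the quantity a Gini-based stopping rule would substitute for the sample standard deviation. The function names and simulation settings are illustrative assumptions.

```python
import numpy as np

def gini_mean_difference(x):
    """Gini's mean difference: G_n = 2 / (n(n-1)) * sum_{i<j} |x_i - x_j|.

    Uses the sorted-data identity sum_{i<j} (x_(j) - x_(i)) =
    sum_k (2k - n + 1) * x_(k) (0-based k) to avoid an O(n^2) double loop.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    k = np.arange(n)
    pairwise_abs_sum = np.sum((2.0 * k - n + 1.0) * x)
    return 2.0 * pairwise_abs_sum / (n * (n - 1))

def gmd_scale_estimate(x):
    """Rescale G_n to estimate sigma under normality: for X, Y iid
    N(mu, sigma^2), E|X - Y| = 2*sigma/sqrt(pi), so (sqrt(pi)/2) * G_n
    is a consistent estimator of sigma.
    """
    return 0.5 * np.sqrt(np.pi) * gini_mean_difference(x)

# Quick check on simulated normal data (true sigma = 2).
rng = np.random.default_rng(1)
sample = rng.normal(loc=10.0, scale=2.0, size=200)
print(gmd_scale_estimate(sample))   # should be close to 2
print(np.std(sample, ddof=1))       # the classical competitor
```

In a three-stage procedure, an estimator of this kind would enter the pilot- and second-stage sample-size formulas in place of the sample standard deviation; the article's exact stopping boundaries are not reproduced here.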
ACKNOWLEDGMENTS
The authors thank the anonymous referees, the Associate Editor, and the Editor-in-Chief for valuable suggestions that improved an earlier version of this article.
DISCLOSURE
The authors have no conflicts of interest to report.