Abstract
In sequential analysis, Bayes stopping rules are often difficult to determine explicitly. Bickel and Yahav (1967, Proceedings of the 5th Berkeley Symposium on Mathematical Statistics and Probability, VI, pp. 401–413) provided an attractive large-sample approximation to sequential Bayes rules, which they called "asymptotically pointwise optimal" (A.P.O.) rules. The present paper proposes A.P.O. rules for certain hierarchical Bayes regression models. These rules are shown to be asymptotically "non-deficient" in the sense of Woodroofe (1981, Zeitschrift für Wahrscheinlichkeitstheorie und Verwandte Gebiete, 58, pp. 331–341). This work extends the results of Ghosh and Hoekstra (1989, Sequential Analysis, 8, pp. 79–100) to a multivariate regression setting, thereby considerably extending their applicability.