ABSTRACT
Although there is a significant literature on the asymptotic theory of the Bayes factor, the set-ups considered are usually specialized and often involve independent and identically distributed data. Even in such specialized cases, mostly weak consistency results are available. In this article, for the first time, we derive the almost sure convergence theory of the Bayes factor in a general set-up that includes dependent data and misspecified models. Somewhat surprisingly, the key to the proof of such a general theory is a simple application of a result of Shalizi to a well-known identity satisfied by the Bayes factor. Supplementary materials for this article are available online.
Supplementary Materials
Section S-1 provides Shalizi's assumptions regarding posterior convergence, Section S-2 illustrates our Bayes factor result with competing AR(1) models, and Section S-3 discusses the applicability of our Bayes factor result to some infinite-dimensional models.
Acknowledgment
The authors are sincerely grateful to the Editor, the Associate Editor, and the referee, whose detailed and constructive comments have led to significant improvements in the article.