Abstract
Score-based algorithms are proposed for the quickest detection of changes in unnormalized statistical models, i.e., models whose densities are known only up to a normalizing constant. These algorithms also apply to score-based models, where the score, the gradient of the log density, is known to the decision maker. A Bayesian performance analysis is provided for these algorithms and compared with that of their classical counterparts. It is shown that strong performance guarantees hold for these score-based algorithms, with the Kullback-Leibler divergence between the pre- and post-change densities replaced by their Fisher divergence.
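To make the idea concrete, the following is a minimal sketch (not the authors' exact construction) of a CUSUM-style detector driven by the Hyvärinen score, which depends on the density only through its log-gradient and is therefore computable for unnormalized models. The Gaussian example, the multiplier `lam`, and the threshold value are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: a CUSUM-style recursion driven by the Hyvarinen score,
# which needs only grad_x log p(x), so the normalizing constant cancels.
# For a 1-D Gaussian N(mu, sigma^2):
#   grad_x log p(x) = -(x - mu) / sigma^2,  d^2/dx^2 log p(x) = -1 / sigma^2
# Hyvarinen score: S(x, p) = 0.5 * (grad log p)^2 + laplacian of log p.

def hyvarinen_score_gauss(x, mu, sigma2):
    grad = -(x - mu) / sigma2
    lap = -1.0 / sigma2
    return 0.5 * grad**2 + lap

def score_cusum(xs, mu0, mu1, sigma2=1.0, lam=1.0, threshold=10.0):
    """Return the first time n at which the score-based statistic
    crosses the threshold, or None if it never does."""
    w = 0.0
    for n, x in enumerate(xs, start=1):
        # Increment: pre-change score minus post-change score, scaled by lam.
        # Its mean is negative before the change and positive after it.
        z = lam * (hyvarinen_score_gauss(x, mu0, sigma2)
                   - hyvarinen_score_gauss(x, mu1, sigma2))
        w = max(0.0, w + z)  # classical CUSUM reflection at zero
        if w >= threshold:
            return n
    return None

rng = np.random.default_rng(0)
# Simulated change from N(0, 1) to N(2, 1) at time 100.
xs = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 100)])
print(score_cusum(xs, mu0=0.0, mu1=2.0))
```

In this Gaussian case the increment reduces to an affine function of the observation, so the statistic drifts down before the change and up after it, mirroring the behavior of the classical likelihood-ratio CUSUM.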
ACKNOWLEDGMENT
We thank the editor, the associate editor, and the referees for their constructive suggestions and comments. Their feedback has significantly improved this article.
DISCLOSURE
The authors have no conflicts of interest to report.