Abstract
Change-point analysis is the task of finding abrupt (and significant) changes in the underlying model of a signal or time series. Change-point detection methods typically require specifying two input parameters: the maximum number of segments to search for and the minimum segment length. However, there is no objective way to pre-specify these two parameters, as suitable values depend largely on the particular application. Within this framework, a recursive optimization algorithm is developed that is capable of exploring and fine-tuning these two input parameters and optimally segmenting a time series. This multiple change-point detection technique therefore addresses a wide class of real-life contexts and problems in which identifying optimal level shifts in a time series is the main goal. Extensive simulation results are presented, and a real-life example illustrates the implementation of the developed scheme in practice and demonstrates its capabilities. Concluding remarks and suggestions for future research are also provided.
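To make the role of these two input parameters concrete, the following is a minimal sketch of standard binary segmentation for mean shifts under a least-squares cost; it is not the recursive optimization algorithm developed in this paper, and the names `max_cp` (maximum number of change points, i.e. one less than the maximum number of segments) and `min_len` (minimum segment length) are illustrative.

```python
import numpy as np

def segment_cost(x):
    # Least-squares cost of one segment under a constant-mean model:
    # sum of squared deviations from the segment mean.
    return ((x - x.mean()) ** 2).sum()

def best_split(x, min_len):
    # Find the split index that most reduces the cost, while keeping
    # both resulting segments at least min_len long.
    n = len(x)
    base = segment_cost(x)
    best_i, best_gain = None, 0.0
    for i in range(min_len, n - min_len + 1):
        gain = base - segment_cost(x[:i]) - segment_cost(x[i:])
        if gain > best_gain:
            best_i, best_gain = i, gain
    return best_i, best_gain

def binary_segmentation(x, max_cp, min_len, tol=1e-9):
    # Recursively split the segment whose best split yields the largest
    # cost reduction, until max_cp change points are placed or no split
    # reduces the cost by more than tol.
    x = np.asarray(x, dtype=float)
    change_points, segments = [], [(0, len(x))]
    while len(change_points) < max_cp:
        cand = None  # (start, end, gain, change point)
        for (a, b) in segments:
            i, gain = best_split(x[a:b], min_len)
            if i is not None and (cand is None or gain > cand[2]):
                cand = (a, b, gain, a + i)
        if cand is None or cand[2] <= tol:
            break
        a, b, _, cp = cand
        segments.remove((a, b))
        segments += [(a, cp), (cp, b)]
        change_points.append(cp)
    return sorted(change_points)

# On a noise-free signal with two level shifts, the sketch recovers both:
signal = [0.0] * 30 + [5.0] * 30 + [0.0] * 30
print(binary_segmentation(signal, max_cp=5, min_len=5))  # → [30, 60]
```

With a larger `max_cp` or a smaller `min_len`, the search space grows and spurious change points become more likely on noisy data, which is exactly the tuning problem the abstract refers to.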
Acknowledgments
The authors would like to thank the Department of Epidemiological Surveillance and Intervention of the Hellenic National Public Health Organization (NPHO) for providing the influenza-like illness (ILI) rate data, collected weekly through the sentinel surveillance system. The authors would also like to thank the editor and the anonymous referees, whose valuable suggestions and comments significantly improved the paper.
Disclosure statement
No potential conflict of interest was reported by the author(s).