Networks and Matrices

Fused-Lasso Regularized Cholesky Factors of Large Nonstationary Covariance Matrices of Replicated Time Series

Pages 157-170 | Received 15 Aug 2021, Accepted 11 Jun 2022, Published online: 19 Jul 2022

