Theory and Methods

Simultaneous Decorrelation of Matrix Time Series

Pages 957-969 | Received 20 May 2021, Accepted 20 Nov 2022, Published online: 11 Jan 2023

References

  • Ahn, S. C., and Horenstein, A. R. (2013), “Eigenvalue Ratio Test for the Number of Factors,” Econometrica, 81, 1203–1227.
  • Back, A. D., and Weigend, A. S. (1997), “A First Application of Independent Component Analysis to Extracting Structure from Stock Returns,” International Journal of Neural Systems, 8, 473–484. DOI: 10.1142/s0129065797000458.
  • Bai, J., and Ng, S. (2002), “Determining the Number of Factors in Approximate Factor Models,” Econometrica, 70, 191–221.
  • Basu, S., Dunagan, J., Duh, K., and Muniswamy-Reddy, K.-K. (2012), “BLR-D: Applying Bilinear Logistic Regression to Factored Diagnosis Problems,” ACM SIGOPS Operating Systems Review, 45, 31–38.
  • Basu, S., and Michailidis, G. (2015), “Regularized Estimation in Sparse High-Dimensional Time Series Models,” The Annals of Statistics, 43, 1535–1567.
  • Bickel, P. J., and Levina, E. (2008a), “Covariance Regularization by Thresholding,” The Annals of Statistics, 36, 2577–2604.
  • Bickel, P. J., and Levina, E. (2008b), “Regularized Estimation of Large Covariance Matrices,” The Annals of Statistics, 36, 199–227.
  • Box, G. E. P., and Jenkins, G. M. (1970), Time Series Analysis: Forecasting and Control, San Francisco, CA: Holden-Day.
  • Bradley, R. C. (2005), “Basic Properties of Strong Mixing Conditions. A Survey and Some Open Questions,” Probability Surveys, 2, 107–144.
  • Chang, J., Guo, B., and Yao, Q. (2018), “Principal Component Analysis for Second-Order Stationary Vector Time Series,” The Annals of Statistics, 46, 2094–2124.
  • Chang, J., He, J., Yang, L., and Yao, Q. (2021), “Modelling Matrix Time Series via a Tensor CP-Decomposition,” arXiv preprint arXiv:2112.15423.
  • Chen, E. Y., Tsay, R. S., and Chen, R. (2020), “Constrained Factor Models for High-Dimensional Matrix-Variate Time Series,” Journal of the American Statistical Association, 115, 775–793.
  • Chen, E. Y., Yun, X., Chen, R., and Yao, Q. (2020), “Modeling Multivariate Spatial-Temporal Data with Latent Low-Dimensional Dynamics,” arXiv preprint arXiv:2002.01305.
  • Chen, R., Xiao, H., and Yang, D. (2021), “Autoregressive Models for Matrix-Valued Time Series,” Journal of Econometrics, 222, 539–560.
  • Chen, R., Yang, D., and Zhang, C.-H. (2022), “Factor Models for High-Dimensional Tensor Time Series,” Journal of the American Statistical Association, 117, 94–116.
  • Fan, J., and Yao, Q. (2003), Nonlinear Time Series: Nonparametric and Parametric Methods. Springer Series in Statistics. New York: Springer.
  • Forni, M., Hallin, M., Lippi, M., and Reichlin, L. (2005), “The Generalized Dynamic Factor Model: One-Sided Estimation and Forecasting,” Journal of the American Statistical Association, 100, 830–840.
  • Gao, Z., Ma, Y., Wang, H., and Yao, Q. (2019), “Banded Spatio-Temporal Autoregressions,” Journal of Econometrics, 208, 211–230.
  • Ghosh, S., Khare, K., and Michailidis, G. (2019), “High-Dimensional Posterior Consistency in Bayesian Vector Autoregressive Models,” Journal of the American Statistical Association, 114, 735–748. DOI: 10.1080/01621459.2018.1437043.
  • Guo, S., Wang, Y., and Yao, Q. (2016), “High-Dimensional and Banded Vector Autoregressions,” Biometrika, 103, 889–903.
  • Han, Y., Chen, L., and Wu, W. (2020), “Sparse Nonlinear Vector Autoregressive Models,” Technical Report.
  • Han, Y., Chen, R., Yang, D., and Zhang, C.-H. (2020), “Tensor Factor Model Estimation by Iterative Projection,” arXiv preprint arXiv:2006.02611.
  • Han, Y., Chen, R., and Zhang, C.-H. (2022), “Rank Determination in Tensor Factor Model,” Electronic Journal of Statistics, 16, 1726–1803.
  • Han, Y., and Tsay, R. S. (2020), “High-Dimensional Linear Regression for Dependent Observations with Application to Nowcasting,” Statistica Sinica, 30, 1797–1827.
  • Han, Y., Tsay, R. S., and Wu, W. B. (2023), “High Dimensional Generalized Linear Models for Temporal Dependent Data,” Bernoulli, 29, 105–131.
  • Han, Y., and Zhang, C.-H. (2022), “Tensor Principal Component Analysis in High Dimensional CP Models,” IEEE Transactions on Information Theory, forthcoming.
  • Han, Y., Zhang, C.-H., and Chen, R. (2021), “CP Factor Model for Dynamic Tensors,” arXiv preprint arXiv:2110.15517.
  • Hoff, P. D. (2015), “Multilinear Tensor Regression for Longitudinal Relational Data,” Annals of Applied Statistics, 9, 1169–1193. DOI: 10.1214/15-AOAS839.
  • Huang, D., and Tsay, R. (2014), “A Refined Scalar Component Approach to Multivariate Time Series Modeling,” Manuscript.
  • Lam, C., and Yao, Q. (2012), “Factor Modeling for High-Dimensional Time Series: Inference for the Number of Factors,” The Annals of Statistics, 40, 694–726.
  • Lam, C., Yao, Q., and Bathia, N. (2011), “Estimation of Latent Factors for High-Dimensional Time Series,” Biometrika, 98, 901–918.
  • Lin, J., and Michailidis, G. (2017), “Regularized Estimation and Testing for High-Dimensional Multi-Block Vector-Autoregressive Models,” The Journal of Machine Learning Research, 18, 4188–4236.
  • Liu, W., Xiao, H., and Wu, W. B. (2013), “Probability and Moment Inequalities under Dependence,” Statistica Sinica, 23, 1257–1272.
  • Matteson, D. S., and Tsay, R. S. (2011), “Dynamic Orthogonal Components for Multivariate Time Series,” Journal of the American Statistical Association, 106, 1450–1463.
  • Merlevède, F., Peligrad, M., and Rio, E. (2011), “A Bernstein Type Inequality and Moderate Deviations for Weakly Dependent Sequences,” Probability Theory and Related Fields, 151, 435–474.
  • Negahban, S., and Wainwright, M. J. (2011), “Estimation of (near) Low-Rank Matrices with Noise and High-Dimensional Scaling,” The Annals of Statistics, 39, 1069–1097.
  • Pan, J., and Yao, Q. (2008), “Modelling Multiple Time Series via Common Factors,” Biometrika, 95, 365–379.
  • Peña, D., and Box, G. E. (1987), “Identifying a Simplifying Structure in Time Series,” Journal of the American Statistical Association, 82, 836–843.
  • Rio, E. (2000), Théorie asymptotique des processus aléatoires faiblement dépendants, Volume 31 of Mathématiques & Applications [Mathematics & Applications], Berlin: Springer.
  • Rohde, A., and Tsybakov, A. B. (2011), “Estimation of High-Dimensional Low-Rank Matrices,” The Annals of Statistics, 39, 887–930.
  • Stewart, G. W., and Sun, J. G. (1990), Matrix Perturbation Theory. Computer Science and Scientific Computing. Boston, MA: Academic Press, Inc.
  • Tiao, G. C., and Tsay, R. S. (1989), “Model Specification in Multivariate Time Series,” Journal of the Royal Statistical Society, Series B, 51, 157–195.
  • Wang, D., Liu, X., and Chen, R. (2019), “Factor Models for Matrix-Valued High-Dimensional Time Series,” Journal of Econometrics, 208, 231–248.
  • Wang, L., Zhang, Z., and Dunson, D. (2019), “Symmetric Bilinear Regression for Signal Subgraph Estimation,” IEEE Transactions on Signal Processing, 67, 1929–1940.
  • Wu, W. B., and Wu, Y. N. (2016), “Performance Bounds for Parameter Estimates of High-Dimensional Linear Models with Correlated Errors,” Electronic Journal of Statistics, 10, 352–379.
  • Xia, D., and Yuan, M. (2019), “Statistical Inferences of Linear Forms for Noisy Matrix Completion,” arXiv preprint arXiv:1909.00116.
  • Xia, Y., Cai, T., and Cai, T. T. (2018), “Multiple Testing of Submatrices of a Precision Matrix with Applications to Identification of between Pathway Interactions,” Journal of the American Statistical Association, 113, 328–339. DOI: 10.1080/01621459.2016.1251930.
  • Xiao, H., Han, Y., and Chen, R. (2022), “Reduced Rank Autoregressive Models for Matrix Time Series,” Journal of Business & Economic Statistics, forthcoming.
  • Zhang, D., and Wu, W. B. (2017), “Gaussian Approximation for High Dimensional Time Series,” The Annals of Statistics, 45, 1895–1919.
  • Zhou, H., Li, L., and Zhu, H. (2013), “Tensor Regression with Applications in Neuroimaging Data Analysis,” Journal of the American Statistical Association, 108, 540–552.
  • Zhou, H. H., and Raskutti, G. (2018), “Non-parametric Sparse Additive Auto-Regressive Network Models,” IEEE Transactions on Information Theory, 65, 1473–1492.