References
- Albert, R., and Barabási, A.-L. (2002), ‘Statistical Mechanics of Complex Networks’, Reviews of Modern Physics, 74, 47–97.
- Bertsekas, D.P. (1999), Nonlinear Programming (2nd ed.), Nashua, NH: Athena Scientific.
- Brémaud, P. (1999), Markov Chains – Gibbs Fields, Monte Carlo Simulation, and Queues, New York, NY: Springer.
- Brownlee, A.E.I., Regnier-Coudert, O., McCall, J.A.W., Massie, S., and Stulajter, S. (2013), ‘An Application of a GA With Markov Network Surrogate to Feature Selection’, International Journal of Systems Science, 44(11), 2039–2056.
- Cai, J.F., Candès, E.J., and Shen, Z. (2010), ‘A Singular Value Thresholding Algorithm for Matrix Completion’, SIAM Journal on Optimization, 20(4), 1956–1982.
- Combettes, P.L., and Pesquet, J. (2011), ‘Proximal Splitting Methods in Signal Processing’, in Fixed-Point Algorithms for Inverse Problems in Science and Engineering, Springer Optimization and Its Applications, New York, NY: Springer, pp. 185–212.
- Combettes, P.L., and Wajs, V.R. (2005), ‘Signal Recovery by Proximal Forward-Backward Splitting’, Multiscale Modeling and Simulation, 4, 1168–1200.
- Cover, T.M., and Thomas, J.A. (1991), Elements of Information Theory (1st ed.), New York, NY: John Wiley & Sons, Inc.
- Deng, K., and Huang, D. (2012), ‘Model Reduction of Markov Chains via Low-Rank Approximation’, in Proceedings of American Control Conference, Montréal, Canada, pp. 2651–2656.
- Deng, K., Mehta, P.G., and Meyn, S.P. (2009b), ‘A Simulation-Based Method for Aggregating Markov Chains’, in Proceedings of IEEE Conference on Decision and Control, Shanghai, China, pp. 4710–4716.
- Deng, K., Mehta, P.G., and Meyn, S.P. (2011a), ‘Optimal Kullback-Leibler Aggregation via Spectral Theory of Markov Chains’, IEEE Transactions on Automatic Control, 56(12), 2793–2808.
- Deng, K., Mehta, P.G., Meyn, S.P., and Vidyasagar, M. (2011b), ‘A Recursive Learning Algorithm for Model Reduction of Hidden Markov Models’, in Proceedings of IEEE Conference on Decision and Control, Orlando, FL, pp. 4674–4679.
- Deng, K., Sun, Y., Mehta, P.G., and Meyn, S.P. (2009a), ‘An Information-Theoretic Framework to Aggregate a Markov Chain’, in Proceedings of American Control Conference, St. Louis, MO, pp. 731–736.
- Deuflhard, P., Huisinga, W., Fischer, A., and Schütte, C. (2000), ‘Identification of Almost Invariant Aggregates in Reversible Nearly Uncoupled Markov Chains’, Linear Algebra and its Applications, 315(1–3), 39–59.
- E, W., Li, T., and Vanden-Eijnden, E. (2008), ‘Optimal Partition and Effective Dynamics of Complex Networks’, Proceedings of the National Academy of Sciences of the United States of America, 105(23), 7907–7912.
- Eckart, C., and Young, G. (1936), ‘The Approximation of One Matrix by Another of Lower Rank’, Psychometrika, 1(3), 211–218.
- Eckstein, J., and Bertsekas, D.P. (1992), ‘On the Douglas-Rachford Splitting Method and the Proximal Point Algorithm for Maximal Monotone Operators’, Mathematical Programming, 55, 293–318.
- Fazel, M. (2002), ‘Matrix Rank Minimization With Applications’, Ph.D. thesis, Stanford University, Stanford, CA.
- Han, H., Ding, Y., Hao, K., and Hu, L. (2013), ‘Particle Filter for State Estimation of Jump Markov Nonlinear System With Application to Multi-Targets Tracking’, International Journal of Systems Science, 44(7), 1333–1343.
- Hiriart-Urruty, J.B., and Lemaréchal, C. (2001), Fundamentals of Convex Analysis, New York, NY: Springer-Verlag.
- Huang, D., and Meyn, S. (2010), ‘Feature Extraction for Universal Hypothesis Testing via Rank-Constrained Optimization’, in Proceedings of 2010 IEEE International Symposium on Information Theory, Austin, TX, pp. 1618–1622.
- Huisinga, W., Meyn, S.P., and Schütte, C. (2004), ‘Phase Transitions and Metastability in Markovian and Molecular Systems’, Annals of Applied Probability, 14(1), 419–458.
- Juang, B.H., and Rabiner, L.R. (1985), ‘A Probabilistic Distance Measure for Hidden Markov Models’, AT&T Technical Journal, 64(2), 391–408.
- Kowalczuk, Z., and Domzalski, M. (2012), ‘Optimal Asynchronous Estimation of 2D Gaussian-Markov Processes’, International Journal of Systems Science, 43(8), 1431–1440.
- Lafon, S., Keller, Y., and Coifman, R.R. (2006), ‘Data Fusion and Multicue Data Matching by Diffusion Maps’, IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(11), 1784–1797.
- Liu, Q., Tan, H., and Guo, X. (2012), ‘Denumerable Continuous-Time Markov Decision Processes With Multiconstraints on Average Costs’, International Journal of Systems Science, 43(3), 576–585.
- Meyn, S.P., and Tweedie, R.L. (2009), Markov Chains and Stochastic Stability (2nd ed.), Cambridge, UK: Cambridge University Press.
- Miao, G., Xu, S., and Zou, Y. (2013), ‘Necessary and Sufficient Conditions for Mean Square Consensus Under Markov Switching Topologies’, International Journal of Systems Science, 44(1), 178–186.
- Moreau, J.J. (1962), ‘Fonctions Convexes Duales et Points Proximaux Dans un Espace Hilbertien’, Comptes Rendus de l’Académie des Sciences (Paris), Série A, 255, 2897–2899.
- Puterman, M.L. (2005), Markov Decision Processes: Discrete Stochastic Dynamic Programming, Hoboken, NJ: John Wiley & Sons, Inc.
- Recht, B., Fazel, M., and Parrilo, P.A. (2010), ‘Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization’, SIAM Review, 52(3), 471–501.
- Setzer, S., Steidl, G., and Teuber, T. (2010), ‘Deblurring Poissonian Images by Split Bregman Techniques’, Journal of Visual Communication and Image Representation, 21, 193–199.
- Shi, J., and Malik, J. (2000), ‘Normalized Cuts and Image Segmentation’, IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(8), 888–905.
- Sun, Y., and Mehta, P.G. (2010), ‘The Kullback-Leibler Rate Pseudo-Metric for Comparing Dynamical Systems’, IEEE Transactions on Automatic Control, 55(7), 1585–1598.
- Vidyasagar, M. (2010), ‘Reduced-Order Modeling of Markov and Hidden Markov Processes via Aggregation’, in Proceedings of IEEE Conference on Decision and Control, pp. 1810–1815.
- Yin, G.G., and Zhang, Q. (1998), Continuous-Time Markov Chains and Applications: A Singular Perturbation Approach, New York: Springer-Verlag.