
Variational Approximation for Mixtures of Linear Mixed Models

Pages 564–585 | Received 01 Dec 2011, Published online: 28 Apr 2014

References

  • Armagan, A., and Dunson, D. (2011), “Sparse Variational Analysis of Linear Mixed Models for Large Data Sets,” Statistics and Probability Letters, 81, 1056–1062.
  • Attias, H. (1999), “Inferring Parameters and Structure of Latent Variable Models by Variational Bayes,” in Proceedings of the 15th Conference on Uncertainty in Artificial Intelligence, pp. 21–30.
  • Biernacki, C., Celeux, G., and Govaert, G. (2003), “Choosing Starting Values for the EM Algorithm for Getting the Highest Likelihood in Multivariate Gaussian Mixture Models,” Computational Statistics and Data Analysis, 41, 561–575.
  • Blei, D.M., and Jordan, M.I. (2006), “Variational Inference for Dirichlet Process Mixtures,” Bayesian Analysis, 1, 121–144.
  • Booth, J.G., Casella, G., and Hobert, J.P. (2008), “Clustering Using Objective Functions and Stochastic Search,” Journal of the Royal Statistical Society, Series B, 70, 119–139.
  • Braun, M., and McAuliffe, J. (2010), “Variational Inference for Large-Scale Models of Discrete Choice,” Journal of the American Statistical Association, 105, 324–335.
  • Celeux, G., Martin, O., and Lavergne, C. (2005), “Mixture of Linear Mixed Models for Clustering Gene Expression Profiles From Repeated Microarray Experiments,” Statistical Modelling, 5, 243–267.
  • Chen, M.H., Shao, Q.M., and Ibrahim, J.G. (2000), Monte Carlo Methods in Bayesian Computation, New York: Springer.
  • Coke, G., and Tsao, M. (2010), “Random Effects Mixture Models for Clustering Electrical Load Series,” Journal of Time Series Analysis, 31, 451–464.
  • Constantinopoulos, C., and Likas, A. (2007), “Unsupervised Learning of Gaussian Mixtures Based on Variational Component Splitting,” IEEE Transactions on Neural Networks, 18, 745–755.
  • Corduneanu, A., and Bishop, C.M. (2001), “Variational Bayesian Model Selection for Mixture Distributions,” in Proceedings of the 8th International Conference on Artificial Intelligence and Statistics, pp. 27–34.
  • Dempster, A.P., Laird, N.M., and Rubin, D.B. (1977), “Maximum Likelihood From Incomplete Data via the EM Algorithm,” Journal of the Royal Statistical Society, Series B, 39, 1–38.
  • Finley, A.O., Banerjee, S., and McRoberts, R.E. (2008), “A Bayesian Approach to Multi-Source Forest Area Estimation,” Environmental and Ecological Statistics, 15, 241–258.
  • Fong, Y., Rue, H., and Wakefield, J. (2010), “Bayesian Inference for Generalised Linear Mixed Models,” Biostatistics, 11, 397–412.
  • Gelfand, A.E., Sahu, S.K., and Carlin, B.P. (1995), “Efficient Parametrisations for Normal Linear Mixed Models,” Biometrika, 82, 479–488.
  • Hubert, L., and Arabie, P. (1985), “Comparing Partitions,” Journal of Classification, 2, 193–218.
  • Ideker, T., Thorsson, V., Ranish, J.A., Christmas, R., Buhler, J., Eng, J.K., Bumgarner, R., Goodlett, D.R., Aebersold, R., and Hood, L. (2001), “Integrated Genomic and Proteomic Analyses of a Systematically Perturbed Metabolic Network,” Science, 292, 929–934.
  • Jacobs, R.A., Jordan, M.I., Nowlan, S.J., and Hinton, G.E. (1991), “Adaptive Mixtures of Local Experts,” Neural Computation, 3, 79–87.
  • Jordan, M.I., Ghahramani, Z., Jaakkola, T.S., and Saul, L.K. (1999), “An Introduction to Variational Methods for Graphical Models,” Machine Learning, 37, 183–233.
  • Luan, Y., and Li, H. (2003), “Clustering of Time-Course Gene Expression Data Using a Mixed-Effects Model With B-Splines,” Bioinformatics, 19, 474–482.
  • McGrory, C.A., and Titterington, D.M. (2007), “Variational Approximations in Bayesian Model Selection for Finite Mixture Distributions,” Computational Statistics and Data Analysis, 51, 5352–5367.
  • McLachlan, G.J., Do, K.A., and Ambroise, C. (2004), Analyzing Microarray Gene Expression Data, New York: Wiley.
  • Meng, X.L. (1994), “On the Rate of Convergence of the ECM Algorithm,” The Annals of Statistics, 22, 326–339.
  • Ng, S.K., McLachlan, G.J., Wang, K., Ben-Tovim Jones, L., and Ng, S.-W. (2006), “A Mixture Model With Random-Effects Components for Clustering Correlated Gene-Expression Profiles,” Bioinformatics, 22, 1745–1752.
  • Nott, D.J., Tan, S.L., Villani, M., and Kohn, R. (2012), “Regression Density Estimation With Variational Methods and Stochastic Approximation,” Journal of Computational and Graphical Statistics, 21, 797–820.
  • Ormerod, J.T., and Wand, M.P. (2010), “Explaining Variational Approximations,” The American Statistician, 64, 140–153.
  • ——— (2012), “Gaussian Variational Approximate Inference for Generalized Linear Mixed Models,” Journal of Computational and Graphical Statistics, 21, 2–17.
  • Papaspiliopoulos, O., Roberts, G.O., and Sköld, M. (2007), “A General Framework for the Parametrization of Hierarchical Models,” Statistical Science, 22, 59–73.
  • Sahu, S.K., and Roberts, G.O. (1999), “On Convergence of the EM Algorithm and the Gibbs Sampler,” Statistics and Computing, 9, 55–64.
  • Scharl, T., Grün, B., and Leisch, F. (2010), “Mixtures of Regression Models for Time Course Gene Expression Data: Evaluation of Initialization and Random Effects,” Bioinformatics, 26, 370–377.
  • Schwarz, G. (1978), “Estimating the Dimension of a Model,” The Annals of Statistics, 6, 461–464.
  • Spellman, P.T., Sherlock, G., Zhang, M.Q., Iyer, V.R., Anders, K., Eisen, M.B., Brown, P.O., Botstein, D., and Futcher, B. (1998), “Comprehensive Identification of Cell Cycle-Regulated Genes of the Yeast Saccharomyces cerevisiae by Microarray Hybridization,” Molecular Biology of the Cell, 9, 3273–3297.
  • Ueda, N., and Ghahramani, Z. (2002), “Bayesian Model Search for Mixture Models Based on Optimizing Variational Bounds,” Neural Networks, 15, 1223–1241.
  • Venables, W.N., and Ripley, B.D. (2002), Modern Applied Statistics with S (4th ed.), New York: Springer.
  • Verbeek, J.J., Vlassis, N., and Kröse, B. (2003), “Efficient Greedy Learning of Gaussian Mixture Models,” Neural Computation, 15, 469–485.
  • Wand, M.P. (2002), “Vector Differential Calculus in Statistics,” The American Statistician, 56, 55–62.
  • Wang, B., and Titterington, D.M. (2005), “Inadequacy of Interval Estimates Corresponding to Variational Bayesian Approximations,” in Proceedings of the 10th International Workshop on Artificial Intelligence and Statistics, pp. 373–380.
  • Waterhouse, S., MacKay, D., and Robinson, T. (1996), “Bayesian Methods for Mixtures of Experts,” Advances in Neural Information Processing Systems, 8, 351–357.
  • Winn, J., and Bishop, C.M. (2005), “Variational Message Passing,” Journal of Machine Learning Research, 6, 661–694.
  • Wu, B., McGrory, C.A., and Pettitt, A.N. (2012), “A New Variational Bayesian Algorithm With Application to Human Mobility Pattern Modeling,” Statistics and Computing, 22, 185–203.
  • Yeung, K.Y., Medvedovic, M., and Bumgarner, R.E. (2003), “Clustering Gene-Expression Data With Repeated Measurements,” Genome Biology, 4, R34.
