Monte Carlo and Approximation Methods

Consensus Monte Carlo for Random Subsets Using Shared Anchors

Pages 703–714 | Received 04 May 2019, Accepted 25 Feb 2020, Published online: 15 Apr 2020

References

  • Argiento, R., Guglielmi, A., and Pievatolo, A. (2010), “Bayesian Density Estimation and Model Selection Using Nonparametric Hierarchical Mixtures,” Computational Statistics & Data Analysis, 54, 816–832. DOI: 10.1016/j.csda.2009.11.002.
  • Barrios, E., Lijoi, A., Nieto-Barajas, L. E., and Prünster, I. (2013), “Modeling With Normalized Random Measure Mixture Models,” Statistical Science, 28, 313–334. DOI: 10.1214/13-STS416.
  • Blei, D. M., and Jordan, M. I. (2006), “Variational Inference for Dirichlet Process Mixtures,” Bayesian Analysis, 1, 121–143. DOI: 10.1214/06-BA104.
  • Broderick, T., Jordan, M. I., and Pitman, J. (2013), “Cluster and Feature Modeling From Combinatorial Stochastic Processes,” Statistical Science, 28, 289–312. DOI: 10.1214/13-STS434.
  • Broderick, T., Kulis, B., and Jordan, M. I. (2013), “MAD-Bayes: MAP-Based Asymptotic Derivations From Bayes,” in International Conference on Machine Learning, pp. 226–234.
  • Broderick, T., Pitman, J., and Jordan, M. I. (2013), “Feature Allocations, Probability Functions, and Paintboxes,” Bayesian Analysis, 8, 801–836. DOI: 10.1214/13-BA823.
  • Campbell, T., and Broderick, T. (2019), “Automated Scalable Bayesian Inference via Hilbert Coresets,” The Journal of Machine Learning Research, 20, 551–588.
  • de Blasi, P., Favaro, S., Lijoi, A., Mena, R. H., Prünster, I., and Ruggiero, M. (2015), “Are Gibbs-Type Priors the Most Natural Generalization of the Dirichlet Process?,” IEEE Transactions on Pattern Analysis and Machine Intelligence, 37, 212–229. DOI: 10.1109/TPAMI.2013.217.
  • Doshi-Velez, F., Miller, K., Van Gael, J., and Teh, Y. W. (2009), “Variational Inference for the Indian Buffet Process,” in Artificial Intelligence and Statistics, pp. 137–144.
  • Doshi-Velez, F., Mohamed, S., Ghahramani, Z., and Knowles, D. A. (2009), “Large Scale Nonparametric Bayesian Inference: Data Parallelisation in the Indian Buffet Process,” in Advances in Neural Information Processing Systems, pp. 1294–1302.
  • Entezari, R., Craiu, R. V., and Rosenthal, J. S. (2018), “Likelihood Inflating Sampling Algorithm,” Canadian Journal of Statistics, 46, 147–175. DOI: 10.1002/cjs.11343.
  • Favaro, S., and Teh, Y. W. (2013), “MCMC for Normalized Random Measure Mixture Models,” Statistical Science, 28, 335–359. DOI: 10.1214/13-STS422.
  • Ge, H., Chen, Y., Wan, M., and Ghahramani, Z. (2015), “Distributed Inference for Dirichlet Process Mixture Models,” in Proceedings of the 32nd International Conference on Machine Learning, pp. 2276–2284.
  • Ghahramani, Z., and Griffiths, T. L. (2006), “Infinite Latent Feature Models and the Indian Buffet Process,” in Advances in Neural Information Processing Systems, pp. 475–482.
  • Ghosal, S. (2010), “The Dirichlet Process, Related Priors and Posterior Asymptotics,” in Bayesian Nonparametrics, eds. N. L. Hjort, C. Holmes, P. Müller, and S. G. Walker, Cambridge: Cambridge University Press, pp. 22–34.
  • Hartigan, J. A. (1972), “Direct Clustering of a Data Matrix,” Journal of the American Statistical Association, 67, 123–129. DOI: 10.1080/01621459.1972.10481214.
  • Huang, Z., and Gelman, A. (2005), “Sampling for Bayesian Computation With Large Datasets,” Technical Report, Department of Statistics, Columbia University.
  • Huggins, J., Campbell, T., and Broderick, T. (2016), “Coresets for Scalable Bayesian Logistic Regression,” in Advances in Neural Information Processing Systems, pp. 4080–4088.
  • Kingman, J. F. C. (1978), “The Representation of Partition Structures,” Journal of the London Mathematical Society, 2, 374–380. DOI: 10.1112/jlms/s2-18.2.374.
  • Kunkel, D., and Peruggia, M. (2018), “Anchored Bayesian Gaussian Mixture Models,” arXiv no. 1805.08304.
  • Kurihara, K., Welling, M., and Teh, Y. W. (2007), “Collapsed Variational Dirichlet Process Mixture Models,” in IJCAI (Vol. 7), pp. 2796–2801.
  • Lau, J. W., and Green, P. J. (2007), “Bayesian Model-Based Clustering Procedures,” Journal of Computational and Graphical Statistics, 16, 526–558. DOI: 10.1198/106186007X238855.
  • LeCun, Y., Bottou, L., Bengio, Y., and Haffner, P. (1998), “Gradient-Based Learning Applied to Document Recognition,” Proceedings of the IEEE, 86, 2278–2324. DOI: 10.1109/5.726791.
  • Lee, J., Müller, P., Gulukota, K., and Ji, Y. (2015), “A Bayesian Feature Allocation Model for Tumor Heterogeneity,” The Annals of Applied Statistics, 9, 621–639. DOI: 10.1214/15-AOAS817.
  • Lijoi, A., Mena, R. H., and Prünster, I. (2005), “Hierarchical Mixture Modeling With Normalized Inverse-Gaussian Priors,” Journal of the American Statistical Association, 100, 1278–1291. DOI: 10.1198/016214505000000132.
  • Lijoi, A., Mena, R. H., and Prünster, I. (2007), “Controlling the Reinforcement in Bayesian Non-Parametric Mixture Models,” Journal of the Royal Statistical Society, Series B, 69, 715–740.
  • Lin, D. (2013), “Online Learning of Nonparametric Mixture Models via Sequential Variational Approximation,” in Advances in Neural Information Processing Systems, pp. 395–403.
  • Lo, A. Y. (1984), “On a Class of Bayesian Nonparametric Estimates: I. Density Estimates,” The Annals of Statistics, 12, 351–357. DOI: 10.1214/aos/1176346412.
  • MacEachern, S. N. (2000), “Dependent Dirichlet Processes,” Unpublished manuscript, Department of Statistics, The Ohio State University, pp. 1–40.
  • Mak, S., and Joseph, V. R. (2018), “Support Points,” The Annals of Statistics, 46, 2562–2592. DOI: 10.1214/17-AOS1629.
  • Minsker, S., Srivastava, S., Lin, L., and Dunson, D. (2014), “Scalable and Robust Bayesian Inference via the Median Posterior,” in International Conference on Machine Learning, pp. 1656–1664.
  • Neal, R. M. (2000), “Markov Chain Sampling Methods for Dirichlet Process Mixture Models,” Journal of Computational and Graphical Statistics, 9, 249–265. DOI: 10.2307/1390653.
  • Neiswanger, W., Wang, C., and Xing, E. (2013), “Asymptotically Exact, Embarrassingly Parallel MCMC,” arXiv no. 1311.4780.
  • Newton, M. A., Quintana, F. A., and Zhang, Y. (1998), “Nonparametric Bayes Methods Using Predictive Updating,” in Practical Nonparametric and Semiparametric Bayesian Statistics, eds. D. Dey, P. Müller, and D. Sinha, New York: Springer, pp. 45–61.
  • Ni, Y., Müller, P., and Ji, Y. (2019), “Bayesian Double Feature Allocation for Phenotyping With Electronic Health Records,” Journal of the American Statistical Association, in press.
  • Ni, Y., Müller, P., Diesendruck, M., Williamson, S., Zhu, Y., and Ji, Y. (2019), “Scalable Bayesian Nonparametric Clustering and Classification,” Journal of Computational and Graphical Statistics, in press. DOI: 10.1080/10618600.2019.1624366.
  • Ni, Y., Müller, P., Shpak, M., and Ji, Y. (2019), “Parallel-Tempered Feature Allocation for Large-Scale Tumor Heterogeneity With Deep Sequencing Data,” Technical Report.
  • Pitman, J., and Yor, M. (1997), “The Two-Parameter Poisson-Dirichlet Distribution Derived From a Stable Subordinator,” The Annals of Probability, 25, 855–900. DOI: 10.1214/aop/1024404422.
  • Rabinovich, M., Angelino, E., and Jordan, M. I. (2015), “Variational Consensus Monte Carlo,” in Advances in Neural Information Processing Systems, pp. 1207–1215.
  • Rai, P., and Daumé, H. (2011), “Beam Search Based MAP Estimates for the Indian Buffet Process,” in Proceedings of the 28th International Conference on Machine Learning (ICML-11), pp. 705–712.
  • Reed, C., and Ghahramani, Z. (2013), “Scaling the Indian Buffet Process via Submodular Maximization,” in International Conference on Machine Learning, pp. 1013–1021.
  • Richardson, S., and Green, P. J. (1997), “On Bayesian Analysis of Mixtures With an Unknown Number of Components” (with discussion), Journal of the Royal Statistical Society, Series B, 59, 731–792. DOI: 10.1111/1467-9868.00095.
  • Rodriguez, A., Lenkoski, A., and Dobra, A. (2011), “Sparse Covariance Estimation in Heterogeneous Samples,” Electronic Journal of Statistics, 5, 981. DOI: 10.1214/11-EJS634.
  • Scott, S. L., Blocker, A. W., Bonassi, F. V., Chipman, H. A., George, E. I., and McCulloch, R. E. (2016), “Bayes and Big Data: The Consensus Monte Carlo Algorithm,” International Journal of Management Science and Engineering Management, 11, 78–88. DOI: 10.1080/17509653.2016.1142191.
  • Tank, A., Foti, N., and Fox, E. (2015), “Streaming Variational Inference for Bayesian Nonparametric Mixture Models,” in Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, pp. 968–976.
  • van der Maaten, L., and Hinton, G. (2008), “Visualizing Data Using t-SNE,” Journal of Machine Learning Research, 9, 2579–2605.
  • Wade, S., and Ghahramani, Z. (2018), “Bayesian Cluster Analysis: Point Estimation and Credible Balls” (with discussion), Bayesian Analysis, 13, 559–626. DOI: 10.1214/17-BA1073.
  • Wang, L., and Dunson, D. B. (2011), “Fast Bayesian Inference in Dirichlet Process Mixture Models,” Journal of Computational and Graphical Statistics, 20, 196–216. DOI: 10.1198/jcgs.2010.07081.
  • Wang, X., and Dunson, D. B. (2013), “Parallelizing MCMC via Weierstrass Sampler,” arXiv no. 1312.4605.
  • White, S., Kypraios, T., and Preston, S. (2015), “Piecewise Approximate Bayesian Computation: Fast Inference for Discretely Observed Markov Models Using a Factorised Posterior Distribution,” Statistics and Computing, 25, 289. DOI: 10.1007/s11222-013-9432-2.
  • Williamson, S. A., Dubey, A., and Xing, E. P. (2013), “Parallel Markov Chain Monte Carlo for Nonparametric Mixture Models,” in Proceedings of the 30th International Conference on International Conference on Machine Learning, pp. 98–106.
  • Xu, Y., Müller, P., Yuan, Y., Gulukota, K., and Ji, Y. (2015), “MAD Bayes for Tumor Heterogeneity—Feature Allocation With Exponential Family Sampling,” Journal of the American Statistical Association, 110, 503–514. DOI: 10.1080/01621459.2014.995794.
  • Zuanetti, D. A., Müller, P., Zhu, Y., Yang, S., and Ji, Y. (2019), “Bayesian Nonparametric Clustering for Large Data Sets,” Statistics and Computing, 29, 203–215. DOI: 10.1007/s11222-018-9803-9.
