Bayesian and Latent Variable Models

Bayesian Variational Inference for Exponential Random Graph Models

Pages 910–928 | Received 07 Dec 2018, Accepted 04 Mar 2020, Published online: 15 Apr 2020

References

  • Atchadé, Y. F., Lartillot, N., and Robert, C. (2013), “Bayesian Computation for Statistical Models With Intractable Normalizing Constants,” Brazilian Journal of Probability and Statistics, 27, 416–436. DOI: 10.1214/11-BJPS174.
  • Besag, J. (1974), “Spatial Interaction and the Statistical Analysis of Lattice Systems,” Journal of the Royal Statistical Society, Series B, 36, 192–236. DOI: 10.1111/j.2517-6161.1974.tb00999.x.
  • Bouranis, L., Friel, N., and Maire, F. (2017), “Efficient Bayesian Inference for Exponential Random Graph Models by Correcting the Pseudo-Posterior Distribution,” Social Networks, 50, 98–108. DOI: 10.1016/j.socnet.2017.03.013.
  • Bouranis, L., Friel, N., and Maire, F. (2018), “Bayesian Model Selection for Exponential Random Graph Models via Adjusted Pseudolikelihoods,” Journal of Computational and Graphical Statistics, 27, 516–528.
  • Burda, Y., Grosse, R., and Salakhutdinov, R. (2016), “Importance Weighted Autoencoders,” in Proceedings of the 4th International Conference on Learning Representations (ICLR).
  • Caimo, A., and Friel, N. (2011), “Bayesian Inference for Exponential Random Graph Models,” Social Networks, 33, 41–55. DOI: 10.1016/j.socnet.2010.09.004.
  • Caimo, A., and Friel, N. (2013), “Bayesian Model Selection for Exponential Random Graph Models,” Social Networks, 35, 11–24.
  • Caimo, A., and Friel, N. (2014), “Bergm: Bayesian Exponential Random Graphs in R,” Journal of Statistical Software, 61, 1–25.
  • Chen, J., and Luss, R. (2019), “Stochastic Gradient Descent With Biased but Consistent Gradient Estimators,” arXiv no. 1807.11880.
  • Chen, J., Ma, T., and Xiao, C. (2018), “FastGCN: Fast Learning With Graph Convolutional Networks via Importance Sampling,” arXiv no. 1801.10247.
  • Chib, S., and Jeliazkov, I. (2001), “Marginal Likelihood From the Metropolis-Hastings Output,” Journal of the American Statistical Association, 96, 270–281. DOI: 10.1198/016214501750332848.
  • Gal, Y. (2016), “Uncertainty in Deep Learning,” PhD thesis, University of Cambridge.
  • Geyer, C. J., and Thompson, E. A. (1992), “Constrained Monte Carlo Maximum Likelihood for Dependent Data,” Journal of the Royal Statistical Society, Series B, 54, 657–699. DOI: 10.1111/j.2517-6161.1992.tb01443.x.
  • Hummel, R. M., Hunter, D. R., and Handcock, M. S. (2012), “Improving Simulation-Based Algorithms for Fitting ERGMs,” Journal of Computational and Graphical Statistics, 21, 920–939. DOI: 10.1080/10618600.2012.679224.
  • Hunter, D. R., Goodreau, S. M., and Handcock, M. S. (2008), “Goodness of Fit of Social Network Models,” Journal of the American Statistical Association, 103, 248–258. DOI: 10.1198/016214507000000446.
  • Hunter, D. R., Handcock, M. S., Butts, C. T., Goodreau, S. M., and Morris, M. (2008), “ergm: A Package to Fit, Simulate and Diagnose Exponential-Family Models for Networks,” Journal of Statistical Software, 24, 1–29. DOI: 10.18637/jss.v024.i03.
  • Kass, R. E., and Raftery, A. E. (1995), “Bayes Factors,” Journal of the American Statistical Association, 90, 773–795. DOI: 10.1080/01621459.1995.10476572.
  • Kingma, D. P., and Ba, J. (2014), “Adam: A Method for Stochastic Optimization,” arXiv no. 1412.6980.
  • Kingma, D. P., and Welling, M. (2014), “Auto-Encoding Variational Bayes,” in Proceedings of the 2nd International Conference on Learning Representations (ICLR).
  • Knowles, D. A., and Minka, T. (2011), “Non-Conjugate Variational Message Passing for Multinomial and Binary Regression,” in Advances in Neural Information Processing Systems (Vol. 24), eds. J. Shawe-Taylor, R. S. Zemel, P. L. Bartlett, F. Pereira, and K. Q. Weinberger, Curran Associates, Inc., pp. 1701–1709.
  • Kong, A., Liu, J. S., and Wong, W. H. (1994), “Sequential Imputations and Bayesian Missing Data Problems,” Journal of the American Statistical Association, 89, 278–288. DOI: 10.1080/01621459.1994.10476469.
  • Le, T. A., Kosiorek, A. R., Siddharth, N., Teh, Y. W., and Wood, F. (2019), “Revisiting Reweighted Wake-Sleep for Models With Stochastic Control Flow,” in Uncertainty in Artificial Intelligence (to appear).
  • Liang, F. (2010), “A Double Metropolis-Hastings Sampler for Spatial Models With Intractable Normalizing Constants,” Journal of Statistical Computation and Simulation, 80, 1007–1022. DOI: 10.1080/00949650902882162.
  • Liang, F., Jin, I. H., Song, Q., and Liu, J. S. (2016), “An Adaptive Exchange Algorithm for Sampling From Distributions With Intractable Normalizing Constants,” Journal of the American Statistical Association, 111, 377–393. DOI: 10.1080/01621459.2015.1009072.
  • Liu, J. S. (2004), Monte Carlo Strategies in Scientific Computing, New York: Springer-Verlag.
  • Liu, Q., and Pierce, D. A. (1994), “A Note on Gauss-Hermite Quadrature,” Biometrika, 81, 624–629. DOI: 10.2307/2337136.
  • Lyne, A.-M., Girolami, M., Atchadé, Y., Strathmann, H., and Simpson, D. (2015), “On Russian Roulette Estimates for Bayesian Inference With Doubly-Intractable Likelihoods,” Statistical Science, 30, 443–467. DOI: 10.1214/15-STS523.
  • Martino, L., Elvira, V., and Louzada, F. (2017), “Effective Sample Size for Importance Sampling Based on Discrepancy Measures,” Signal Processing, 131, 386–401. DOI: 10.1016/j.sigpro.2016.08.025.
  • Møller, J., Pettitt, A. N., Reeves, R., and Berthelsen, K. K. (2006), “An Efficient Markov Chain Monte Carlo Method for Distributions With Intractable Normalising Constants,” Biometrika, 93, 451–458. DOI: 10.1093/biomet/93.2.451.
  • Murray, I., Ghahramani, Z., and MacKay, D. J. C. (2006), “MCMC for Doubly-Intractable Distributions,” in Proceedings of the Twenty-Second Conference on Uncertainty in Artificial Intelligence, UAI’06, Arlington, VA: AUAI Press, pp. 359–366.
  • Neal, R. M. (2001), “Annealed Importance Sampling,” Statistics and Computing, 11, 125–139. DOI: 10.1023/A:1008923215028.
  • Ormerod, J. T., and Wand, M. P. (2012), “Gaussian Variational Approximate Inference for Generalized Linear Mixed Models,” Journal of Computational and Graphical Statistics, 21, 2–17. DOI: 10.1198/jcgs.2011.09118.
  • Park, J., and Haran, M. (2018), “Bayesian Inference in the Presence of Intractable Normalizing Functions,” Journal of the American Statistical Association, 113, 1372–1390. DOI: 10.1080/01621459.2018.1448824.
  • Pearson, M., and Michell, L. (2000), “Smoke Rings: Social Network Analysis of Friendship Groups, Smoking and Drug-Taking,” Drugs: Education, Prevention and Policy, 7, 21–37. DOI: 10.1080/713660095.
  • Rezende, D. J., Mohamed, S., and Wierstra, D. (2014), “Stochastic Backpropagation and Approximate Inference in Deep Generative Models,” in Proceedings of the 31st International Conference on Machine Learning, JMLR Workshop and Conference Proceedings, eds. E. P. Xing and T. Jebara, pp. 1278–1286.
  • Robbins, H., and Monro, S. (1951), “A Stochastic Approximation Method,” The Annals of Mathematical Statistics, 22, 400–407. DOI: 10.1214/aoms/1177729586.
  • Roeder, G., Wu, Y., and Duvenaud, D. K. (2017), “Sticking the Landing: Simple, Lower-Variance Gradient Estimators for Variational Inference,” in Advances in Neural Information Processing Systems (Vol. 30), eds. I. Guyon, U. Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, and R. Garnett.
  • Salgado, H., Santos-Zavaleta, A., Gama-Castro, S., Millán-Zárate, D., Díaz-Peredo, E., Sánchez-Solano, F., Pérez-Rueda, E., Bonavides-Martínez, C., and Collado-Vides, J. (2001), “RegulonDB (Version 3.2): Transcriptional Regulation and Operon Organization in Escherichia coli K-12,” Nucleic Acids Research, 29, 72–74. DOI: 10.1093/nar/29.1.72.
  • Saul, Z. M., and Filkov, V. (2007), “Exploring Biological Network Structure Using Exponential Random Graph Models,” Bioinformatics, 23, 2604–2611. DOI: 10.1093/bioinformatics/btm370.
  • Shen-Orr, S., Milo, R., Mangan, S., and Alon, U. (2002), “Network Motifs in the Transcriptional Regulation Network of Escherichia coli,” Nature Genetics, 31, 64–68. DOI: 10.1038/ng881.
  • Snijders, T. A. (2002), “Markov Chain Monte Carlo Estimation of Exponential Random Graph Models,” Journal of Social Structure, 3, 1–40.
  • Spall, J. C. (2003), Introduction to Stochastic Search and Optimization: Estimation, Simulation and Control, Hoboken, NJ: Wiley.
  • Strauss, D., and Ikeda, M. (1990), “Pseudolikelihood Estimation for Social Networks,” Journal of the American Statistical Association, 85, 204–212. DOI: 10.1080/01621459.1990.10475327.
  • Tadić, V. B., and Doucet, A. (2017), “Asymptotic Bias of Stochastic Gradient Search,” Annals of Applied Probability, 27, 3255–3304. DOI: 10.1214/16-AAP1272.
  • Tan, L. S. L. (2020), “Use of Model Reparametrization to Improve Variational Bayes,” arXiv no. 1805.07267.
  • Tan, L. S. L., and Nott, D. J. (2013), “Variational Inference for Generalized Linear Mixed Models Using Partially Non-Centered Parametrizations,” Statistical Science, 28, 168–188. DOI: 10.1214/13-STS418.
  • Tan, L. S. L., and Nott, D. J. (2018), “Gaussian Variational Approximation With Sparse Precision Matrices,” Statistics and Computing, 28, 259–275.
  • Titsias, M., and Lázaro-Gredilla, M. (2014), “Doubly Stochastic Variational Bayes for Non-Conjugate Inference,” in Proceedings of the 31st International Conference on Machine Learning (ICML-14), pp. 1971–1979.
  • Tran, M.-N., Nott, D. J., and Kohn, R. (2017), “Variational Bayes With Intractable Likelihood,” Journal of Computational and Graphical Statistics, 26, 873–882. DOI: 10.1080/10618600.2017.1330205.
  • van Duijn, M. A., Gile, K. J., and Handcock, M. S. (2009), “A Framework for the Comparison of Maximum Pseudo-Likelihood and Maximum Likelihood Estimation of Exponential Family Random Graph Models,” Social Networks, 31, 52–62. DOI: 10.1016/j.socnet.2008.10.003.
  • Wand, M. P. (2014), “Fully Simplified Multivariate Normal Updates in Non-Conjugate Variational Message Passing,” Journal of Machine Learning Research, 15, 1351–1369.
  • Williams, R. J. (1992), “Simple Statistical Gradient-Following Algorithms for Connectionist Reinforcement Learning,” Machine Learning, 8, 229–256. DOI: 10.1007/BF00992696.
  • Winn, J., and Bishop, C. M. (2005), “Variational Message Passing,” Journal of Machine Learning Research, 6, 661–694.
  • Xu, M., Quiroz, M., Kohn, R., and Sisson, S. A. (2019), “Variance Reduction Properties of the Reparameterization Trick,” in Proceedings of Machine Learning Research (PMLR) (Vol. 89), eds. K. Chaudhuri and M. Sugiyama, pp. 2711–2720.
  • Zachary, W. W. (1977), “An Information Flow Model for Conflict and Fission in Small Groups,” Journal of Anthropological Research, 33, 452–473. DOI: 10.1086/jar.33.4.3629752.
