Mixing of MCMC algorithms

Pages 2261-2279 | Received 11 Mar 2019, Accepted 30 Apr 2019, Published online: 09 May 2019
