Short Technical Notes

Statistically Efficient Thinning of a Markov Chain Sampler

Pages 738-744 | Received 01 Sep 2016, Published online: 01 Aug 2017
