Research Article

Fast Computer Model Calibration using Annealed and Transformed Variational Inference

Received 25 Nov 2022, Accepted 10 May 2024, Accepted author version posted online: 08 Jul 2024