Deep Gaussian Process Emulation using Stochastic Imputation

Pages 150–161 | Received 02 Jan 2022, Accepted 08 Aug 2022, Published online: 12 Oct 2022

References

  • Beck, J., and Guillas, S. (2016), “Sequential Design with Mutual Information for Computer Experiments (MICE): Emulation of a Tsunami Model,” SIAM/ASA Journal on Uncertainty Quantification, 4, 739–766. DOI: 10.1137/140989613.
  • Bui, T., Hernández-Lobato, D., Hernandez-Lobato, J., Li, Y., and Turner, R. (2016), “Deep Gaussian Processes for Regression using Approximate Expectation Propagation,” in International Conference on Machine Learning, pp. 1472–1481.
  • Capriotti, L., Jiang, Y., and Macrina, A. (2017), “AAD and Least-Square Monte Carlo: Fast Bermudan-Style Options and XVA Greeks,” Algorithmic Finance, 6, 35–49. DOI: 10.3233/AF-170201.
  • Celeux, G., Chauveau, D., and Diebolt, J. (1996), “Stochastic Versions of the EM Algorithm: An Experimental Study in the Mixture Case,” Journal of Statistical Computation and Simulation, 55, 287–314. DOI: 10.1080/00949659608811772.
  • Celeux, G., and Diebolt, J. (1985), “The SEM Algorithm: A Probabilistic Teacher Algorithm Derived from the EM Algorithm for the Mixture Problem,” Computational Statistics Quarterly, 2, 73–82.
  • Damianou, A., and Lawrence, N. (2013), “Deep Gaussian Processes,” in Artificial Intelligence and Statistics, pp. 207–215.
  • Diebolt, J., and Ip, E. H. S. (1996), “Stochastic EM: Method and Application,” in Markov Chain Monte Carlo in Practice, pp. 259–273, Springer.
  • Dutordoir, V., Salimbeni, H., Hambro, E., McLeod, J., Leibfried, F., Artemev, A., van der Wilk, M., Deisenroth, M. P., Hensman, J., and John, S. (2021), “GPflux: A Library for Deep Gaussian Processes,” arXiv:2104.05674.
  • Duvenaud, D., Rippel, O., Adams, R., and Ghahramani, Z. (2014), “Avoiding Pathologies in Very Deep Networks,” in Artificial Intelligence and Statistics, pp. 202–210.
  • Goldberg, P. W., Williams, C. K., and Bishop, C. M. (1997), “Regression with Input-Dependent Noise: A Gaussian Process Treatment,” Advances in Neural Information Processing Systems, 10, 493–499.
  • Gramacy, R. B., and Lee, H. K. H. (2008), “Bayesian Treed Gaussian Process Models with an Application to Computer Modeling,” Journal of the American Statistical Association, 103, 1119–1130. DOI: 10.1198/016214508000000689.
  • Gu, M., Palomo, J., and Berger, J. O. (2018), “RobustGaSP: Robust Gaussian Stochastic Process Emulation in R,” arXiv:1801.01874.
  • Havasi, M., Hernández-Lobato, J. M., and Murillo-Fuentes, J. J. (2018), “Inference in Deep Gaussian Processes using Stochastic Gradient Hamiltonian Monte Carlo,” in Advances in Neural Information Processing Systems, pp. 7506–7516.
  • Hebbal, A., Brevault, L., Balesdent, M., Talbi, E.-G., and Melab, N. (2021), “Bayesian Optimization using Deep Gaussian Processes with Applications to Aerospace System Design,” Optimization and Engineering, 22, 321–361. DOI: 10.1007/s11081-020-09517-8.
  • Heston, S. L. (1993), “A Closed-Form Solution for Options with Stochastic Volatility with Applications to Bond and Currency Options,” The Review of Financial Studies, 6, 327–343. DOI: 10.1093/rfs/6.2.327.
  • Ip, E. H. S. (1994), “A Stochastic EM Estimator in the Presence of Missing Data – Theory and Applications,” Technical Report 304, Stanford University.
  • Ip, E. H. S. (2002), “On Single versus Multiple Imputation for a Class of Stochastic Algorithms Estimating Maximum Likelihood,” Computational Statistics, 17, 517–524. DOI: 10.1007/s001800200124.
  • Kingma, D. P., and Ba, J. (2015), “Adam: A Method for Stochastic Optimization,” in International Conference on Learning Representations (ICLR).
  • Kyzyurova, K. N., Berger, J. O., and Wolpert, R. L. (2018), “Coupling Computer Models through Linking their Statistical Emulators,” SIAM/ASA Journal on Uncertainty Quantification, 6, 1151–1171. DOI: 10.1137/17M1157702.
  • Ming, D., and Guillas, S. (2021), “Linked Gaussian Process Emulation for Systems of Computer Models using Matérn Kernels and Adaptive Design,” SIAM/ASA Journal on Uncertainty Quantification, 9, 1615–1642. DOI: 10.1137/20M1323771.
  • Montagna, S., and Tokdar, S. T. (2016), “Computer Emulation with Nonstationary Gaussian Processes,” SIAM/ASA Journal on Uncertainty Quantification, 4, 26–47. DOI: 10.1137/141001512.
  • Murray, I., Adams, R., and MacKay, D. (2010), “Elliptical Slice Sampling,” in Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, JMLR Workshop and Conference Proceedings, pp. 541–548.
  • Nielsen, S. F. (2000), “The Stochastic EM Algorithm: Estimation and Asymptotic Results,” Bernoulli, 6, 457–489. DOI: 10.2307/3318671.
  • Nishihara, R., Murray, I., and Adams, R. P. (2014), “Parallel MCMC with Generalized Elliptical Slice Sampling,” The Journal of Machine Learning Research, 15, 2087–2112.
  • Paciorek, C., and Schervish, M. (2003), “Nonstationary Covariance Functions for Gaussian Process Regression,” Advances in Neural Information Processing Systems, 16, 273–280.
  • Radaideh, M. I., and Kozlowski, T. (2020), “Surrogate Modeling of Advanced Computer Simulations using Deep Gaussian Processes,” Reliability Engineering & System Safety, 195, 106731.
  • Rajaram, D., Puranik, T. G., Ashwin Renganathan, S., Sung, W., Fischer, O. P., Mavris, D. N., and Ramamurthy, A. (2020), “Empirical Assessment of Deep Gaussian Process Surrogate Models for Engineering Problems,” Journal of Aircraft, pp. 1–15.
  • Rasmussen, C., and Williams, C. (2005), Gaussian Processes for Machine Learning, Cambridge, MA: MIT Press.
  • Rouah, F. D. (2013), The Heston Model and Its Extensions in Matlab and C#, Hoboken, NJ: Wiley.
  • Salimbeni, H., and Deisenroth, M. (2017), “Doubly Stochastic Variational Inference for Deep Gaussian Processes,” in Advances in Neural Information Processing Systems, pp. 4588–4599.
  • Salmanidou, D. M., Beck, J., and Guillas, S. (2021), “Probabilistic, High-Resolution Tsunami Predictions in North Cascadia by Exploiting Sequential Design for Efficient Emulation,” Natural Hazards and Earth System Sciences Discussions, 21, 3789–3807. DOI: 10.5194/nhess-21-3789-2021.
  • Sauer, A., Gramacy, R. B., and Higdon, D. (2022), “Active Learning for Deep Gaussian Process Surrogates,” Technometrics, 1–15. DOI: 10.1080/00401706.2021.2008505.
  • Snelson, E., and Ghahramani, Z. (2005), “Sparse Gaussian Processes using Pseudo-Inputs,” in Advances in Neural Information Processing Systems (Vol. 18), pp. 1257–1264.
  • Teng, L., Ehrhardt, M., and Günther, M. (2018), “Numerical Simulation of the Heston Model under Stochastic Correlation,” International Journal of Financial Studies, 6, 3. DOI: 10.3390/ijfs6010003.
  • Titsias, M., and Lawrence, N. D. (2010), “Bayesian Gaussian Process Latent Variable Model,” in Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, JMLR Workshop and Conference Proceedings, pp. 844–851.
  • Volodina, V., and Williamson, D. (2020), “Diagnostics-Driven Nonstationary Emulators using Kernel Mixtures,” SIAM/ASA Journal on Uncertainty Quantification, 8, 1–26. DOI: 10.1137/19M124438X.
  • Wang, Y., Brubaker, M., Chaib-Draa, B., and Urtasun, R. (2016), “Sequential Inference for Deep Gaussian Process,” in Artificial Intelligence and Statistics, pp. 694–703.
  • Wei, G. C., and Tanner, M. A. (1990), “A Monte Carlo Implementation of the EM Algorithm and the Poor Man’s Data Augmentation Algorithms,” Journal of the American Statistical Association, 85, 699–704. DOI: 10.1080/01621459.1990.10474930.
  • Zhang, S., Chen, Y., and Liu, Y. (2020), “An Improved Stochastic EM Algorithm for Large-Scale Full-Information Item Factor Analysis,” British Journal of Mathematical and Statistical Psychology, 73, 44–71. DOI: 10.1111/bmsp.12153.