Dimension Reduction and Sparse Modeling

Generalized Variable Selection Algorithms for Gaussian Process Models by LASSO-Like Penalty

Pages 477–486 | Received 15 May 2022, Accepted 14 Aug 2023, Published online: 19 Oct 2023

