Dimension Reduction and Prediction

Sequential Learning of Active Subspaces

Pages 1224-1237 | Received 05 Sep 2019, Accepted 18 Dec 2020, Published online: 08 Mar 2021

References

  • Barnett, S. (1979), Matrix Methods for Engineers and Scientists, New York: McGraw-Hill.
  • Bellman, R. E. (2003), Dynamic Programming, New York: Dover Publications.
  • Binois, M., Huang, J., Gramacy, R. B., and Ludkovski, M. (2019), “Replication or Exploration? Sequential Design for Stochastic Simulation Experiments,” Technometrics, 61, 7–23. DOI: 10.1080/00401706.2018.1469433.
  • Constantine, P. G. (2015), Active Subspaces, Philadelphia, PA: SIAM.
  • Constantine, P. G., Dow, E., and Wang, Q. (2014), “Active Subspace Methods in Theory and Practice: Applications to Kriging Surfaces,” SIAM Journal on Scientific Computing, 36, A1500–A1524. DOI: 10.1137/130916138.
  • Constantine, P. G., Eftekhari, A., Hokanson, J., and Ward, R. A. (2017), “A Near-Stationary Subspace for Ridge Approximation,” Computer Methods in Applied Mechanics and Engineering, 326, 402–421. DOI: 10.1016/j.cma.2017.07.038.
  • Constantine, P. G., Eftekhari, A., and Wakin, M. B. (2015), “Computing Active Subspaces Efficiently With Gradient Sketching,” in 2015 IEEE 6th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP), IEEE, pp. 353–356. DOI: 10.1109/CAMSAP.2015.7383809.
  • De Lozzo, M., and Marrel, A. (2016), “Estimation of the Derivative-Based Global Sensitivity Measures Using a Gaussian Process Metamodel,” SIAM/ASA Journal on Uncertainty Quantification, 4, 708–738. DOI: 10.1137/15M1013377.
  • Djolonga, J., Krause, A., and Cevher, V. (2013), “High-Dimensional Gaussian Process Bandits,” in Advances in Neural Information Processing Systems (Vol. 26), pp. 1025–1033.
  • Durrande, N., Ginsbourger, D., and Roustant, O. (2012), “Additive Kernels for Gaussian Process Modeling,” Annales de la Faculté des Sciences de Toulouse, 17, 481–499.
  • Durrande, N., Ginsbourger, D., Roustant, O., and Carraro, L. (2013), “ANOVA Kernels and RKHS of Zero Mean Functions for Model-Based Sensitivity Analysis,” Journal of Multivariate Analysis, 115, 57–67. DOI: 10.1016/j.jmva.2012.08.016.
  • Duvenaud, D. K., Nickisch, H., and Rasmussen, C. E. (2011), “Additive Gaussian Processes,” in Advances in Neural Information Processing Systems (Vol. 24), eds. J. Shawe-Taylor, R. S. Zemel, P. L. Bartlett, F. Pereira, and K. Q. Weinberger, Curran Associates, Inc., pp. 226–234.
  • Efron, B. (1981), “Nonparametric Estimates of Standard Error: The Jackknife, the Bootstrap and Other Methods,” Biometrika, 68, 589–599. DOI: 10.1093/biomet/68.3.589.
  • Enns, E. A., Kirkeide, M., Mehta, A., MacLehose, R., Knowlton, G. S., Smith, M. K., Searle, K. M., Zhao, R., Gildemeister, S., Simon, A., Sanstead, E., and Kulasingam, S. (2020), “Modeling the Impact of Social Distancing Measures on the Spread of SARS-CoV-2 in Minnesota,” Tech. Rep. 1148–427724, University of Minnesota.
  • Forrester, A. I. J., Sobester, A., and Keane, A. J. (2008), Engineering Design via Surrogate Modelling—A Practical Guide, New York: Wiley.
  • Fort, J.-C., Klein, T., and Rachdi, N. (2016), “New Sensitivity Analysis Subordinated to a Contrast,” Communications in Statistics—Theory and Methods, 45, 4349–4364. DOI: 10.1080/03610926.2014.901369.
  • Fréchet, M. R. (1948), “Les Éléments Aléatoires de Nature Quelconque dans un Espace Distancié,” Annales de l’Institut Henri Poincaré, 10, 215–310.
  • Friedman, J. H., and Stuetzle, W. (1981), “Projection Pursuit Regression,” Journal of the American Statistical Association, 76, 817–823. DOI: 10.1080/01621459.1981.10477729.
  • Fukumizu, K., and Leng, C. (2014), “Gradient-Based Kernel Dimension Reduction for Regression,” Journal of the American Statistical Association, 109, 359–370. DOI: 10.1080/01621459.2013.838167.
  • Garnett, R., Osborne, M. A., and Hennig, P. (2014), “Active Learning of Linear Embeddings for Gaussian Processes,” in Proceedings of the Thirtieth Conference on Uncertainty in Artificial Intelligence, UAI’14, AUAI Press, pp. 230–239.
  • Ghosh, S. (2018), Kernel Smoothing: Principles, Methods and Applications, Hoboken, NJ: Wiley.
  • Glaws, A., Constantine, P. G., and Cook, R. D. (2020), “Inverse Regression for Ridge Recovery: A Data-Driven Approach for Parameter Reduction in Computer Experiments,” Statistics and Computing, 30, 237–253. DOI: 10.1007/s11222-019-09876-y.
  • Hokanson, J., and Constantine, P. G. (2018), “Data-Driven Polynomial Ridge Approximation Using Variable Projection,” SIAM Journal on Scientific Computing, 40, A1566–A1589. DOI: 10.1137/17M1117690.
  • Holodnak, J. T., Ipsen, I. C., and Smith, R. C. (2018), “A Probabilistic Subspace Bound With Application to Active Subspaces,” SIAM Journal on Matrix Analysis and Applications, 39, 1208–1220. DOI: 10.1137/17M1141503.
  • Iooss, B., and Lemaître, P. (2015), “A Review on Global Sensitivity Analysis Methods,” in Uncertainty Management in Simulation-Optimization of Complex Systems: Algorithms and Applications, eds. C. Meloni and G. Dellino, Springer, pp. 101–122.
  • Ji-Guang, S. (1987), “Perturbation of Angles Between Linear Subspaces,” Journal of Computational Mathematics, 5, 58–61.
  • Labopin-Richard, T., and Picheny, V. (2018), “Sequential Design of Experiments for Estimating Percentiles of Black-Box Functions,” Statistica Sinica, 28, 853–877.
  • Larson, J., Menickelly, M., and Wild, S. M. (2019), “Derivative-Free Optimization Methods,” Acta Numerica, 28, 287–404. DOI: 10.1017/S0962492919000060.
  • Lee, M. R. (2019), “Modified Active Subspaces Using the Average of Gradients,” SIAM/ASA Journal on Uncertainty Quantification, 7, 53–66. DOI: 10.1137/17M1140662.
  • Li, K.-C. (1991), “Sliced Inverse Regression for Dimension Reduction,” Journal of the American Statistical Association, 86, 316–327. DOI: 10.1080/01621459.1991.10475035.
  • Li, K.-C. (1992), “On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein’s Lemma,” Journal of the American Statistical Association, 87, 1025–1039. DOI: 10.1080/01621459.1992.10476258.
  • Li, W., Lin, G., and Li, B. (2016), “Inverse Regression-Based Uncertainty Quantification Algorithms for High-Dimensional Models: Theory and Practice,” Journal of Computational Physics, 321, 259–278. DOI: 10.1016/j.jcp.2016.05.040.
  • Ma, Y., and Zhu, L. (2013), “A Review on Dimension Reduction,” International Statistical Review, 81, 134–150. DOI: 10.1111/j.1751-5823.2012.00182.x.
  • Marcy, P. W. (2017), “Bayesian Gaussian Process Models on Spaces of Sufficient Dimension Reduction,” Statistical Perspectives of Uncertainty Quantification Workshop.
  • Morales, J. L., and Nocedal, J. (2011), “Remark on ‘Algorithm 778: L-BFGS-B: Fortran Subroutines for Large-Scale Bound Constrained Optimization’,” ACM Transactions on Mathematical Software, 38, 1–4. DOI: 10.1145/2049662.2049669.
  • Moré, J. J., and Wild, S. M. (2012), “Estimating Derivatives of Noisy Simulations,” ACM Transactions on Mathematical Software, 38, 19. DOI: 10.1145/2168773.2168777.
  • Namura, N., Shimoyama, K., and Obayashi, S. (2017), “Kriging Surrogate Model With Coordinate Transformation Based on Likelihood and Gradient,” Journal of Global Optimization, 68, 827–849. DOI: 10.1007/s10898-017-0516-y.
  • Othmer, C., Lukaczyk, T. W., Constantine, P., and Alonso, J. J. (2016), “On Active Subspaces in Car Aerodynamics,” in 17th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, American Institute of Aeronautics and Astronautics. DOI: 10.2514/6.2016-4294.
  • Palar, P. S., and Shimoyama, K. (2017), “Exploiting Active Subspaces in Global Optimization: How Complex Is Your Problem?,” in Proceedings of the Genetic and Evolutionary Computation Conference Companion on GECCO ’17, ACM Press, pp. 1487–1494.
  • Palar, P. S., and Shimoyama, K. (2018), “On the Accuracy of Kriging Model in Active Subspaces,” in 2018 AIAA/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, p. 0913.
  • Petersen, K. B., and Pedersen, M. S. (2008), “The Matrix Cookbook,” Technical University of Denmark, 7, 15.
  • Rasmussen, C. E., and Williams, C. (2006), Gaussian Processes for Machine Learning, Cambridge, MA: MIT Press.
  • Salem, M. B., Bachoc, F., Roustant, O., Gamboa, F., and Tomaso, L. (2019), “Sequential Dimension Reduction for Learning Features of Expensive Black-Box Functions,” Preprint 01688329v2, HAL.
  • Samarov, A. M. (1993), “Exploring Regression Structure Using Nonparametric Functional Estimation,” Journal of the American Statistical Association, 88, 836–847. DOI: 10.1080/01621459.1993.10476348.
  • Schölkopf, B., and Smola, A. J. (2001), Learning With Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, Cambridge, MA: MIT Press.
  • Sobol, I. (2001), “Global Sensitivity Indices for Nonlinear Mathematical Models and Their Monte Carlo Estimates,” Mathematics and Computers in Simulation, 55, 271–280. The Second IMACS Seminar on Monte Carlo Methods.
  • Sung, C.-L., Wang, W., Plumlee, M., and Haaland, B. (2019), “Multiresolution Functional ANOVA for Large-Scale, Many-Input Computer Experiments,” Journal of the American Statistical Association, 115, 908–919. DOI: 10.1080/01621459.2019.1595630.
  • Titsias, M., and Lawrence, N. D. (2010), “Bayesian Gaussian Process Latent Variable Model,” in Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, pp. 844–851.
  • Tripathy, R., Bilionis, I., and Gonzalez, M. (2016), “Gaussian Processes With Built-In Dimensionality Reduction: Applications to High-Dimensional Uncertainty Propagation,” Journal of Computational Physics, 321, 191–223. DOI: 10.1016/j.jcp.2016.05.039.
  • Viswanath, A., Forrester, A. I. J., and Keane, A. (2011), “Dimension Reduction for Aerodynamic Design Optimization,” AIAA Journal, 49, 1256–1266. DOI: 10.2514/1.J050717.
  • Vivarelli, F., and Williams, C. K. I. (1999), “Discovering Hidden Features With Gaussian Processes Regression,” in Proceedings of the 1998 Conference on Advances in Neural Information Processing Systems II, MIT Press, pp. 613–619.
  • Wang, Z., Hutter, F., Zoghi, M., Matheson, D., and De Freitas, N. (2016), “Bayesian Optimization in a Billion Dimensions via Random Embeddings,” Journal of Artificial Intelligence Research, 55, 361–387. DOI: 10.1613/jair.4806.
  • Wolfram Research, Inc. (2019), “Mathematica, Version 12.0,” Champaign, IL.
