ABSTRACT
In recent years, active subspace methods (ASMs) have become a popular means of performing subspace sensitivity analysis on black-box functions. Applied naively, however, ASMs require gradient evaluations of the target function. When the simulator is noisy, expensive, or stochastic, evaluating gradients via finite differencing may be infeasible. In such cases, a surrogate model is often employed, and finite differencing is performed on the surrogate instead. When the surrogate model is a Gaussian process (GP), we show that the ASM estimator is available in closed form, rendering the finite-difference approximation unnecessary. We use our closed-form solution to develop acquisition functions for sequential learning tailored to sensitivity analysis on top of ASMs. We also show that the traditional ASM estimator may be viewed as a method of moments estimator for a certain class of GPs. We demonstrate how uncertainty on GP hyperparameters may be propagated to uncertainty on the sensitivity analysis, allowing model-based confidence intervals on the active subspace. Our methodological developments are illustrated on several examples. Supplementary files for this article are available online.
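For context, the traditional gradient-based estimator that the abstract contrasts against forms the matrix C = E[∇f(x) ∇f(x)ᵀ] by Monte Carlo over sampled gradients and takes the dominant eigenvectors of C as the active subspace. The following is a minimal illustrative sketch of that classical construction only, not of the closed-form GP estimator developed in the article; the test function, sampling distribution, and sample size are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_f(x):
    # Gradient of the toy ridge function f(x) = exp(0.7*x1 + 0.3*x2),
    # known in closed form here; for a black-box simulator it would
    # require finite differences or a surrogate model.
    a = np.array([0.7, 0.3])
    return np.exp(x @ a)[:, None] * a[None, :]

X = rng.uniform(-1.0, 1.0, size=(2000, 2))  # Monte Carlo input samples
G = grad_f(X)                               # n x d matrix of gradients
C = (G.T @ G) / len(X)                      # estimate of E[grad f grad f^T]

# Eigendecomposition of the symmetric matrix C (eigenvalues ascending);
# the dominant eigenvector spans the one-dimensional active subspace,
# which for this f aligns with (0.7, 0.3) up to normalization and sign.
eigvals, eigvecs = np.linalg.eigh(C)
w = eigvecs[:, -1]
```

Because every gradient of this toy function is proportional to (0.7, 0.3), the estimated C is rank one and the leading eigenvector recovers that direction exactly; for general functions the eigenvalue decay of C indicates how many active directions are present.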
Supplementary Materials
Additional kernel expressions and derivation:
Detailed update derivations and kernel expressions for the Matérn 3/2 and 5/2 kernels, as well as gradients for all kernel expressions. (PDF)
R-package for sequential active subspace UQ:
R-package activegp containing code implementing the methods described in this article (also available from CRAN). (GNU Tar file)
Acknowledgments
The authors would like to thank Robert B. Gramacy for thoughtful comments on early drafts. This article benefited greatly from feedback provided by two anonymous referees.