Abstract
We investigate uncertainty propagation for high-end complex simulation codes whose runtime for a single configuration is on the order of the total available computational budget. To this end, we study the use of lower-fidelity data generated by proper orthogonal decomposition (POD)-based model reduction. A Gaussian process is used to model the difference between the high-fidelity and the low-fidelity data. The approach circumvents extensive sampling of the model outputs, which is impossible in our context, by substituting abundant low-fidelity data for scarce high-fidelity data. This enables uncertainty analysis while accounting for the loss of information caused by the model reduction. We test the approach on Navier–Stokes flow models, first with a simplified code and then with the scalable high-fidelity fluid mechanics solver Nek5000, and demonstrate that it yields reasonably accurate yet conservative error estimates for important statistics, including high quantiles of the drag coefficient.
Acknowledgments
We thank our colleagues Paul Fischer, Elia Merzari, Alex Obabko, and Zhu Wang for their assistance and input.

The submitted manuscript has been created by the University of Chicago as Operator of Argonne National Laboratory (‘Argonne’) under Contract No. DE-AC02-06CH11357 with the US Department of Energy. The US Government retains for itself, and others acting on its behalf, a paid-up, non-exclusive, irrevocable worldwide license in said article to reproduce, prepare derivative works, distribute copies to the public, and perform publicly and display publicly, by or on behalf of the Government.