Abstract
A growing number of industrial risk studies include a probabilistic treatment of the numerous sources of uncertainty. In the uncertainty treatment framework considered in this article, the intrinsic input variability is modelled by a multivariate probability distribution, although only outputs of the physical model may be observed. The objective is to identify a probability distribution whose dispersion is independent of the sample size, since intrinsic variability is at stake. In order to keep the number of (usually CPU-intensive) physical model runs inside the inverse algorithms at a reasonable level, a linearized Gaussian framework is investigated in this article. First, a simple criterion is exhibited to ensure the identifiability of the model. Then, the inverse problem is re-interpreted as a statistical estimation problem with a missing-data structure. Hence, EM-type algorithms may be tested, and the expectation-conditional maximization either (ECME) variant proves advantageous in overcoming the slow convergence that can affect the standard EM algorithm. The estimation quality, as quantified through a ratio of epistemic uncertainty to intrinsic variability, proves to be closely linked to classical sensitivity indices. Numerical experiments on simulated and real data sets in nuclear thermal hydraulics highlight the good performance of these algorithms, provided an adequate parametrization with respect to identifiability.
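As a minimal illustration of the missing-data estimation scheme mentioned above, the sketch below runs a plain EM iteration on a toy linearized Gaussian model y_i = H x_i + u_i, where the inputs x_i ~ N(m, C) are unobserved (missing) and u_i ~ N(0, R) is observation noise. The model, the names `H`, `R`, `m`, `C`, and the fixed iteration count are illustrative assumptions, not the article's actual parametrization or its ECME variant.

```python
import numpy as np

def em_linear_gaussian(y, H, R, n_iter=200):
    """EM estimation of the input mean m and covariance C in the toy
    model y_i = H x_i + u_i, x_i ~ N(m, C), u_i ~ N(0, R), treating the
    unobserved physical inputs x_i as missing data (illustrative sketch)."""
    n, p = y.shape[0], H.shape[1]
    m, C = np.zeros(p), np.eye(p)          # initial guess
    for _ in range(n_iter):
        # E-step: posterior moments of each missing x_i given y_i
        S = H @ C @ H.T + R                # marginal covariance of y_i
        K = C @ H.T @ np.linalg.inv(S)     # gain matrix
        mu = m + (y - H @ m) @ K.T         # posterior means, shape (n, p)
        V = C - K @ H @ C                  # common posterior covariance
        # M-step: update the intrinsic-variability parameters
        m = mu.mean(axis=0)
        d = mu - m
        C = V + d.T @ d / n
    return m, C
```

Identifiability here requires H to have full column rank, echoing the simple criterion discussed in the article; with a rank-deficient H the M-step updates would drift along the unidentified directions.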
Acknowledgements
The authors would like to thank Agnès de Crecy (CEA) and Franck Maurel (EDF/R&D) for their support, comments and suggestions.