7-8 sept. 2023 Fontainebleau (France)
Global sensitivity analysis of model outputs with dependent inputs: New insights around Shapley effects
Bertrand Iooss 1
1 : EDF R&D
EDF Recherche et Développement

In uncertainty quantification of numerical models (computer codes or machine learning models), importance measures (or sensitivity indices) aim to quantify the influence of the model inputs on its outputs. For example, in environmental pollution impact studies, sensitivity analysis makes it possible to determine which physical parameters and environmental data most influence the variability of the computed pollutant concentration. Beyond the variance-based sensitivity indices (also known as Sobol' indices), whose interpretation is restricted by a mutual independence assumption between the model inputs, the Shapley effects, based on cooperative game theory concepts, have recently aroused great interest among users eager for the interpretability of “black box” models. In this talk, we will present recent work on the practical use of Shapley effects for global sensitivity analysis of model outputs and for the interpretability of machine learning models. In particular, statistical estimation issues will be discussed. Moreover, one potentially undesirable effect of the particular allocation induced by Shapley effects will be highlighted, and an alternative allocation choice will be proposed.
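To make the notion concrete, here is a minimal illustrative sketch (not the speaker's method) of Shapley effects via their permutation definition, averaging marginal contributions of each input to the closed Sobol' index c(u) = Var(E[Y | X_u]) / Var(Y). The linear Gaussian model Y = βᵀX with X ~ N(0, Σ), and the helper name `shapley_effects_linear_gaussian`, are assumptions chosen so that c(u) has a closed form (Gaussian conditioning) and no Monte Carlo estimation is needed; this exact enumeration is only tractable for small input dimension d.

```python
import numpy as np
from itertools import permutations

def shapley_effects_linear_gaussian(beta, Sigma):
    """Exact Shapley effects for Y = beta^T X with X ~ N(0, Sigma).

    Illustrative sketch: enumerates all d! input orderings, so it is
    only meant for small d.
    """
    beta = np.asarray(beta, dtype=float)
    Sigma = np.asarray(Sigma, dtype=float)
    d = len(beta)
    var_y = beta @ Sigma @ beta

    def closed_sobol(u):
        # c(u) = Var(E[Y | X_u]) / Var(Y), computed in closed form
        # using Gaussian conditioning: E[X_v | X_u] is linear in X_u.
        if not u:
            return 0.0
        u = list(u)
        v = [j for j in range(d) if j not in u]
        Suu = Sigma[np.ix_(u, u)]
        Suv = Sigma[np.ix_(u, v)]
        # E[Y | X_u] = a^T X_u with a = beta_u + Suu^{-1} Suv beta_v
        a = beta[u] + np.linalg.solve(Suu, Suv @ beta[v])
        return (a @ Suu @ a) / var_y

    # Shapley value: average marginal contribution of input i to c(.)
    # over all orderings of the inputs.
    sh = np.zeros(d)
    perms = list(permutations(range(d)))
    for pi in perms:
        prefix = []
        for i in pi:
            sh[i] += closed_sobol(prefix + [i]) - closed_sobol(prefix)
            prefix.append(i)
    return sh / len(perms)
```

With independent inputs (Σ = I), the Shapley effects coincide with the first-order Sobol' indices (e.g. β = (1, 2) gives effects 0.2 and 0.8); with correlated inputs they redistribute the shared variance among the dependent inputs, which is precisely the allocation behaviour discussed in the talk.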

  • Poster