Sensitivity Analysis of High-Dimensional Models with Correlated Inputs
- URL: http://arxiv.org/abs/2306.00555v1
- Date: Wed, 31 May 2023 14:48:54 GMT
- Title: Sensitivity Analysis of High-Dimensional Models with Correlated Inputs
- Authors: Juraj Kardos, Wouter Edeling, Diana Suleimenova, Derek Groen, Olaf
Schenk
- Abstract summary: The sensitivity of correlated parameters can differ not only in magnitude but even in sign: the derivative-based index can be inverted.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Sensitivity analysis is an important tool used in many domains of
computational science to either gain insight into the mathematical model and
interaction of its parameters or study the uncertainty propagation through the
input-output interactions. In many applications, the inputs are stochastically
dependent, which violates one of the essential assumptions in the
state-of-the-art sensitivity analysis methods. Consequently, results obtained
while ignoring the correlations do not reflect the true contributions of the
input parameters. This study proposes an approach to
address the parameter correlations using a polynomial chaos expansion method
and Rosenblatt and Cholesky transformations to reflect the parameter
dependencies. Treatment of the correlated variables is discussed in the context
of variance-based and derivative-based sensitivity analysis. We demonstrate
that the sensitivity of correlated parameters can differ not only in magnitude
but even in sign: the derivative-based index can be inverted, significantly
altering the predicted model behavior compared to an analysis that disregards
the correlations. Numerous experiments are conducted using workflow automation
tools within the VECMA toolkit.
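The headline effect is easy to reproduce on a toy problem. The sketch below is not the paper's code and does not use the VECMA toolkit: the linear model, the correlation value, and the Gaussian inputs are all illustrative assumptions. It draws correlated inputs through a Cholesky factor of an assumed correlation matrix, then contrasts the partial derivative of the output with respect to x1 (x2 held fixed) with the total effect once the dependence of x2 on x1 is accounted for; the sign flips.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative toy model (not from the paper): y = x1 - 2*x2.
def model(x):
    return x[:, 0] - 2.0 * x[:, 1]

# Correlated standard-normal inputs via the Cholesky factor L of an
# assumed correlation matrix C: if z ~ N(0, I), then x = z @ L.T ~ N(0, C).
rho = 0.8
C = np.array([[1.0, rho],
              [rho, 1.0]])
L = np.linalg.cholesky(C)
z = rng.standard_normal((200_000, 2))
x = z @ L.T
y = model(x)

# Partial derivative w.r.t. x1 with x2 held fixed: dy/dx1 = +1 (positive).
# Total effect of x1 once the induced shift in x2 is included:
# E[x2 | x1] = rho * x1, so dy/dx1 = 1 + rho * (-2) = -0.6 (negative).
# Estimate the total effect empirically by regressing y on x1 alone.
cov = np.cov(x[:, 0], y)
total_effect = cov[0, 1] / cov[0, 0]

print("partial derivative (independence assumed): +1.00")
print(f"total effect under rho={rho}: {total_effect:+.2f}")  # ~ -0.60
```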
Related papers
- A new paradigm for global sensitivity analysis [0.0]
Current theory of global sensitivity analysis is limited in scope; for instance, the analysis is restricted to the output's variance.
It is shown that these limitations are overcome all at once by adopting a new paradigm.
arXiv Detail & Related papers (2024-09-10T07:20:51Z)
- Variance-based sensitivity analysis in the presence of correlated input variables [0.0]
We propose an extension of the classical Sobol' estimator for the estimation of variance based sensitivity indices.
The approach assumes a linear correlation model which is used to decompose the contribution of an input variable into a correlated and an uncorrelated part.
arXiv Detail & Related papers (2024-08-09T08:32:58Z)
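One common way to realize the split described in the entry above, sketched here under assumed unit-variance Gaussian inputs and a toy linear model (none of this is from the paper), is to regress one input on the others: the fitted part carries the correlated contribution and the residual the uncorrelated one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the paper): two unit-variance Gaussian
# inputs with correlation rho, and a toy linear model y = x1 + x2.
rho = 0.6
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
x = rng.standard_normal((200_000, 2)) @ L.T
y = x[:, 0] + x[:, 1]

def first_order_index(u, y):
    """Linear-model first-order index: squared correlation of u with y."""
    c = np.cov(u, y)
    return c[0, 1] ** 2 / (c[0, 0] * c[1, 1])

# Linear correlation model: regress x1 on x2, splitting x1 into a part
# explained by x2 (correlated) and a residual (uncorrelated) part.
beta = np.cov(x[:, 0], x[:, 1])[0, 1] / np.var(x[:, 1])
x1_uncorr = x[:, 0] - beta * x[:, 1]

S1_full = first_order_index(x[:, 0], y)      # total contribution of x1
S1_uncorr = first_order_index(x1_uncorr, y)  # uncorrelated share only
print(f"S1 = {S1_full:.2f}, uncorrelated part = {S1_uncorr:.2f}, "
      f"correlated part = {S1_full - S1_uncorr:.2f}")
# With rho = 0.6 this gives roughly 0.80, 0.20 and 0.60.
```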
- Contrastive Factor Analysis [70.02770079785559]
This paper introduces a novel Contrastive Factor Analysis framework.
It aims to leverage factor analysis's advantageous properties within the realm of contrastive learning.
To further exploit the interpretability properties of non-negative factor analysis, the framework is extended to a non-negative version.
arXiv Detail & Related papers (2024-07-31T16:52:00Z)
- Determining the significance and relative importance of parameters of a simulated quenching algorithm using statistical tools [0.0]
In this paper, the ANOVA (ANalysis Of VAriance) method is used to carry out an exhaustive analysis of a simulated annealing based method.
The significance and relative importance of the parameters with respect to the obtained results, as well as suitable values for each of them, were determined using ANOVA and the post-hoc Tukey HSD test.
arXiv Detail & Related papers (2024-02-08T16:34:00Z)
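A generic version of that ANOVA-plus-Tukey workflow might look as follows; the parameter, its levels, and the response data are made up for illustration, and scipy/statsmodels stand in for whatever tooling the authors used.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)

# Synthetic example: final cost of a simulated-annealing-style run for
# three levels of a hypothetical "cooling rate" parameter.
levels = ["slow", "medium", "fast"]
means = {"slow": 10.0, "medium": 10.2, "fast": 12.0}  # assumed effects
samples = {lv: rng.normal(means[lv], 1.0, size=30) for lv in levels}

# One-way ANOVA: is there any significant difference between levels?
f_stat, p_value = f_oneway(*samples.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Post-hoc Tukey HSD: which pairs of levels actually differ?
values = np.concatenate([samples[lv] for lv in levels])
groups = np.repeat(levels, 30)
print(pairwise_tukeyhsd(values, groups))
```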
- Challenges in Variable Importance Ranking Under Correlation [6.718144470265263]
We present a comprehensive simulation study investigating the impact of feature correlation on the assessment of variable importance.
While there is no correlation between knockoff variables and their corresponding predictor variables, we prove that the correlation increases linearly beyond a certain correlation threshold between the predictor variables.
arXiv Detail & Related papers (2024-02-05T19:02:13Z)
- Towards stable real-world equation discovery with assessing differentiating quality influence [52.2980614912553]
We propose alternatives to the commonly used finite-difference-based method.
We evaluate these methods in terms of their applicability to problems similar to real-world ones and their ability to ensure the convergence of equation discovery algorithms.
arXiv Detail & Related papers (2023-11-09T23:32:06Z)
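To make the failure mode concrete: plain finite differences amplify measurement noise, which can derail downstream equation discovery. The sketch below (synthetic signal and illustrative parameters, not the paper's benchmark) compares a central finite difference against a Savitzky-Golay smoothed derivative, one common alternative.

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic example: noisy samples of sin(t), whose true derivative is cos(t).
t = np.linspace(0.0, 4.0 * np.pi, 400)
dt = t[1] - t[0]
y = np.sin(t) + 0.01 * np.random.default_rng(2).standard_normal(t.size)

# Plain central finite differences amplify the noise.
dy_fd = np.gradient(y, dt)

# Savitzky-Golay: fit local polynomials and differentiate the fit instead.
dy_sg = savgol_filter(y, window_length=31, polyorder=3, deriv=1, delta=dt)

true = np.cos(t)
print("finite-difference RMSE:", np.sqrt(np.mean((dy_fd - true) ** 2)))
print("Savitzky-Golay RMSE:  ", np.sqrt(np.mean((dy_sg - true) ** 2)))
```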
- Identifiable Latent Polynomial Causal Models Through the Lens of Change [82.14087963690561]
Causal representation learning aims to unveil latent high-level causal representations from observed low-level data.
One of its primary tasks is to provide reliable assurance of identifying these latent causal models, known as identifiability.
arXiv Detail & Related papers (2023-10-24T07:46:10Z)
- Identifying Weight-Variant Latent Causal Models [82.14087963690561]
We find that transitivity plays a key role in impeding the identifiability of latent causal representations.
Under some mild assumptions, we can show that the latent causal representations can be identified up to trivial permutation and scaling.
We propose a novel method, termed Structural caUsAl Variational autoEncoder, which directly learns latent causal representations and causal relationships among them.
arXiv Detail & Related papers (2022-08-30T11:12:59Z)
- Data-Driven Influence Functions for Optimization-Based Causal Inference [105.5385525290466]
We study a constructive algorithm that approximates Gateaux derivatives for statistical functionals by finite differencing.
We study the case where probability distributions are not known a priori but need to be estimated from data.
arXiv Detail & Related papers (2022-08-29T16:16:22Z)
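In the simplest case the construction reduces to a one-sided difference quotient of the functional along the direction of a point mass. The sketch below uses a toy functional (the mean) and synthetic data, with illustrative notation rather than the paper's, and checks the finite-difference estimate against the known influence function x0 - mean.

```python
import numpy as np

# Toy statistical functional T(P): the mean of the distribution P,
# evaluated here on an empirical distribution given by sample weights.
def T(sample, weights):
    return np.sum(weights * sample)

rng = np.random.default_rng(3)
sample = rng.normal(size=1000)
w = np.full(sample.size, 1.0 / sample.size)  # empirical distribution

# Gateaux derivative of T at P in the direction of a point mass at x0,
# approximated by finite differencing:
# [T((1 - eps) * P + eps * delta_x0) - T(P)] / eps.
def gateaux_fd(x0, eps=1e-4):
    mixed = T(np.append(sample, x0), np.append((1 - eps) * w, eps))
    return (mixed - T(sample, w)) / eps

x0 = 2.0
print("finite-difference Gateaux derivative:", gateaux_fd(x0))
print("closed-form influence function x0 - mean:", x0 - np.mean(sample))
```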
- Evaluating Sensitivity to the Stick-Breaking Prior in Bayesian Nonparametrics [85.31247588089686]
We show that variational Bayesian methods can yield sensitivities with respect to parametric and nonparametric aspects of Bayesian models.
We provide both theoretical and empirical support for our variational approach to Bayesian sensitivity analysis.
arXiv Detail & Related papers (2021-07-08T03:40:18Z)
- Numerical Solution of the Parametric Diffusion Equation by Deep Neural Networks [2.2731658205414025]
We study the machine-learning-based solution of parametric partial differential equations.
We find strong support for the hypothesis that approximation-theoretical effects heavily influence the practical behavior of learning problems in numerical analysis.
arXiv Detail & Related papers (2020-04-25T12:48:31Z)