Bayesian Model Averaging for Causality Estimation and its Approximation
based on Gaussian Scale Mixture Distributions
- URL: http://arxiv.org/abs/2103.08195v1
- Date: Mon, 15 Mar 2021 08:07:58 GMT
- Title: Bayesian Model Averaging for Causality Estimation and its Approximation
based on Gaussian Scale Mixture Distributions
- Authors: Shunsuke Horii
- Abstract summary: We first show from a Bayesian perspective that it is Bayes optimal to weight (average) the causal effects estimated under each model.
We develop an approximation to the Bayes optimal estimator by using Gaussian scale mixture distributions.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the estimation of the causal effect under linear Structural Causal Models
(SCMs), it is common practice to first identify the causal structure, estimate
the probability distributions, and then calculate the causal effect. However,
if the goal is to estimate the causal effect, it is not necessary to fix a
single causal structure or probability distribution. In this paper, we first
show from a Bayesian perspective that it is Bayes optimal to weight (average)
the causal effects estimated under each model rather than estimating the causal
effect under a fixed single model. This idea is also known as Bayesian model
averaging. Although Bayesian model averaging is optimal, as the number of
candidate models increases, the weighting calculations become computationally
hard. We develop an approximation to the Bayes optimal estimator by using
Gaussian scale mixture distributions.
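The weighting step in the abstract can be made concrete with a small sketch. The following Python example (NumPy/SciPy; the two-model setup, hyperparameters tau and s, and all names are illustrative assumptions, not taken from the paper) averages the causal effect of x on y over two candidate bivariate linear Gaussian SCMs, x -> y and y -> x, with weights proportional to each model's marginal likelihood:

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

rng = np.random.default_rng(0)

# Synthetic data from the "true" SCM x -> y with causal effect 1.5.
n = 200
x = rng.normal(0.0, 1.0, n)
y = 1.5 * x + rng.normal(0.0, 0.5, n)

tau, s = 1.0, 0.25  # illustrative: prior variance of the edge coefficient, noise variance

def log_marginal(child, parent):
    # log p(child | parent) with the edge coefficient b integrated out:
    # child = b * parent + eps, b ~ N(0, tau), eps ~ N(0, s) i.i.d.
    cov = tau * np.outer(parent, parent) + s * np.eye(len(child))
    return multivariate_normal.logpdf(child, mean=np.zeros(len(child)), cov=cov)

def log_root(v):
    # log p(v) for a root variable, modeled as i.i.d. N(0, 1)
    return norm.logpdf(v).sum()

# Candidate model 1: x -> y.  Candidate model 2: y -> x.
lm1 = log_root(x) + log_marginal(y, x)
lm2 = log_root(y) + log_marginal(x, y)

# Posterior model weights under a uniform model prior (shifted for stability).
m = max(lm1, lm2)
w1 = np.exp(lm1 - m) / (np.exp(lm1 - m) + np.exp(lm2 - m))
w2 = 1.0 - w1

# Causal effect of do(x) on y under each model, then the Bayes-averaged estimate.
effect_1 = x @ y / (x @ x + s / tau)  # conjugate posterior mean of b under model 1
effect_2 = 0.0                        # under y -> x, intervening on x leaves y unchanged
bma_effect = w1 * effect_1 + w2 * effect_2
print(f"w1={w1:.3f}, w2={w2:.3f}, BMA effect estimate={bma_effect:.3f}")
```

With data actually generated from x -> y, the weight on that model approaches one and the averaged estimate approaches its per-model posterior mean; with ambiguous data, the average hedges between the candidates rather than committing to a single structure.

The abstract does not detail the Gaussian scale mixture construction, but the standard fact such approximations rest on is that heavy-tailed distributions such as Student's t can be written as mixtures of Gaussians over a random scale, so computations stay Gaussian conditional on the scale. A minimal numerical check of that identity (assuming the usual Gamma-distributed precision):

```python
import numpy as np
from scipy.stats import t as student_t, norm, gamma

# Student's t with nu degrees of freedom as a Gaussian scale mixture:
#   p(x) = E_lam[ N(x; 0, 1/lam) ],  lam ~ Gamma(shape=nu/2, rate=nu/2)
nu = 4.0
xs = np.linspace(-4.0, 4.0, 9)

rng = np.random.default_rng(1)
lam = gamma.rvs(a=nu / 2, scale=2 / nu, size=100_000, random_state=rng)  # precision draws
mixture = np.array([norm.pdf(x0, loc=0.0, scale=1.0 / np.sqrt(lam)).mean() for x0 in xs])

print(np.allclose(mixture, student_t.pdf(xs, df=nu), atol=1e-2))  # -> True
```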
Related papers
- Robust Estimation of Causal Heteroscedastic Noise Models [7.568978862189266]
Student's $t$-distribution is known for its robustness to sampling variability and extreme values at small sample sizes, without significantly altering the overall distribution shape.
Our empirical evaluations demonstrate that our estimators are more robust and achieve better overall performance across synthetic and real benchmarks.
arXiv Detail & Related papers (2023-12-15T02:26:35Z) - A Mean Field Approach to Empirical Bayes Estimation in High-dimensional
Linear Regression [8.345523969593492]
We study empirical Bayes estimation in high-dimensional linear regression.
We adopt a variational empirical Bayes approach, introduced originally in Carbonetto and Stephens (2012) and Kim et al. (2022).
This provides the first rigorous empirical Bayes method in a high-dimensional regression setting without sparsity.
arXiv Detail & Related papers (2023-09-28T20:51:40Z) - Learned harmonic mean estimation of the marginal likelihood with
normalizing flows [6.219412541001482]
We introduce the use of normalizing flows to represent the importance sampling target distribution.
The code implementing the learned harmonic mean, which is publicly available, has been updated to now support normalizing flows.
arXiv Detail & Related papers (2023-06-30T18:00:02Z) - Bivariate Causal Discovery using Bayesian Model Selection [11.726586969589]
We show how to incorporate causal assumptions within the Bayesian framework.
This enables us to construct models with realistic assumptions.
We then outperform previous methods on a wide range of benchmark datasets.
arXiv Detail & Related papers (2023-06-05T14:51:05Z) - Robust Gaussian Process Regression with Huber Likelihood [2.7184224088243365]
We propose a robust process model in the Gaussian process framework with the likelihood of observed data expressed as the Huber probability distribution.
The proposed model employs weights based on projection statistics to scale residuals and bound the influence of vertical outliers and bad leverage points on the latent function estimates.
arXiv Detail & Related papers (2023-01-19T02:59:33Z) - Estimation of Bivariate Structural Causal Models by Variational Gaussian
Process Regression Under Likelihoods Parametrised by Normalising Flows [74.85071867225533]
Causal mechanisms can be described by structural causal models.
One major drawback of state-of-the-art artificial intelligence is its lack of explainability.
arXiv Detail & Related papers (2021-09-06T14:52:58Z) - Evaluating State-of-the-Art Classification Models Against Bayes
Optimality [106.50867011164584]
We show that we can compute the exact Bayes error of generative models learned using normalizing flows.
We use our approach to conduct a thorough investigation of state-of-the-art classification models.
arXiv Detail & Related papers (2021-06-07T06:21:20Z) - Causal Expectation-Maximisation [70.45873402967297]
We show that causal inference is NP-hard even in models characterised by polytree-shaped graphs.
We introduce the causal EM algorithm to reconstruct the uncertainty about the latent variables from data about categorical manifest variables.
We argue that there appears to be an unnoticed limitation to the trending idea that counterfactual bounds can often be computed without knowledge of the structural equations.
arXiv Detail & Related papers (2020-11-04T10:25:13Z) - The Vector Poisson Channel: On the Linearity of the Conditional Mean
Estimator [82.5577471797883]
This work studies properties of the conditional mean estimator in vector Poisson noise.
The first result shows that the conditional mean estimator cannot be linear when the dark current parameter of the Poisson noise is non-zero.
The second result produces a quantitative refinement of the first result.
arXiv Detail & Related papers (2020-03-19T18:21:33Z) - Decision-Making with Auto-Encoding Variational Bayes [71.44735417472043]
We show that a posterior approximation distinct from the variational distribution should be used for making decisions.
Motivated by these theoretical results, we propose learning several approximate proposals for the best model.
In addition to toy examples, we present a full-fledged case study of single-cell RNA sequencing.
arXiv Detail & Related papers (2020-02-17T19:23:36Z) - Distributionally Robust Bayesian Quadrature Optimization [60.383252534861136]
We study BQO under distributional uncertainty in which the underlying probability distribution is unknown except for a limited set of its i.i.d. samples.
A standard BQO approach maximizes the Monte Carlo estimate of the true expected objective given the fixed sample set.
We propose a novel posterior sampling based algorithm, namely distributionally robust BQO (DRBQO) for this purpose.
arXiv Detail & Related papers (2020-01-19T12:00:33Z)