Causal Bayesian Optimization via Exogenous Distribution Learning
- URL: http://arxiv.org/abs/2402.02277v9
- Date: Wed, 25 Dec 2024 14:34:22 GMT
- Title: Causal Bayesian Optimization via Exogenous Distribution Learning
- Authors: Shaogang Ren, Xiaoning Qian
- Abstract summary: Existing Causal Bayesian Optimization (CBO) methods rely on hard interventions that alter the causal structure to maximize the reward.
We develop a new CBO method by leveraging the learned exogenous distribution.
Experiments on different datasets and applications show the benefits of our proposed method.
- Score: 15.8362578568708
- Abstract: Maximizing a target variable as an operational objective in a structural causal model is an important problem. Existing Causal Bayesian Optimization (CBO) methods either rely on hard interventions that alter the causal structure to maximize the reward; or introduce action nodes to endogenous variables so that the data generation mechanisms are adjusted to achieve the objective. In this paper, a novel method is introduced to learn the distribution of exogenous variables, which is typically ignored or marginalized through expectation by existing methods. Exogenous distribution learning improves the approximation accuracy of structural causal models in a surrogate model that is usually trained with limited observational data. Moreover, the learned exogenous distribution extends existing CBO to general causal schemes beyond Additive Noise Models (ANM). The recovery of exogenous variables allows us to use a more flexible prior for noise or unobserved hidden variables. We develop a new CBO method by leveraging the learned exogenous distribution. Experiments on different datasets and applications show the benefits of our proposed method.
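To make the abstract's idea concrete, below is a minimal, illustrative sketch of the general recipe it describes: recover exogenous samples from observational data, learn their distribution, and reuse that distribution when a surrogate model scores candidate interventions. This is not the paper's algorithm; it assumes a toy additive-noise mechanism `Y = f(Z) + U` (the paper's contribution is precisely to go beyond such ANM assumptions), and all function names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy SCM: Z -> Y with Y = f(Z) + U, where U is exogenous noise
# from an unknown, non-Gaussian distribution.
def f(z):
    return np.sin(z) + 0.5 * z

n = 500
Z = rng.uniform(-2.0, 2.0, n)
U = rng.laplace(0.0, 0.3, n)          # true exogenous noise (Laplace, not Gaussian)
Y = f(Z) + U

# Step 1: fit a mechanism surrogate f_hat from observational data
# (a simple polynomial fit stands in for a learned model).
f_hat = np.poly1d(np.polyfit(Z, Y, deg=5))

# Step 2: recover exogenous samples as residuals u_i = y_i - f_hat(z_i)
# and keep them as an empirical (learned) exogenous distribution,
# rather than marginalizing the noise with a fixed Gaussian assumption.
u_hat = Y - f_hat(Z)

# Step 3: score an intervention do(Z = z) by Monte Carlo over the
# learned exogenous samples.
def predicted_reward(z, num_samples=200):
    u = rng.choice(u_hat, size=num_samples, replace=True)
    return float(np.mean(f_hat(z) + u))

# A CBO loop would maximize predicted_reward over candidate interventions;
# here a grid search stands in for the Bayesian optimization step.
candidates = np.linspace(-2.0, 2.0, 41)
best_z = candidates[np.argmax([predicted_reward(z) for z in candidates])]
```

Because `f(z) = sin(z) + 0.5z` is increasing on [-2, 2], the grid search should select an intervention near the right edge of the range; the point of the sketch is only that the recovered residuals, not a hand-picked noise prior, drive the surrogate's reward estimates.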
Related papers
- An AI-powered Bayesian generative modeling approach for causal inference in observational studies [4.624176903641013]
CausalBGM is an AI-powered Bayesian generative modeling approach.
It estimates the individual treatment effect (ITE) by learning individual-specific distributions of a low-dimensional latent feature set.
arXiv Detail & Related papers (2025-01-01T06:52:45Z) - Learning Structural Causal Models from Ordering: Identifiable Flow Models [19.99352354910655]
We introduce a set of flow models that can recover component-wise, invertible transformation of variables.
We propose design improvements that enable simultaneous learning of all causal mechanisms.
Our method achieves a significant reduction in computational time compared to existing diffusion-based techniques.
arXiv Detail & Related papers (2024-12-13T04:25:56Z) - Differentiable Causal Discovery For Latent Hierarchical Causal Models [19.373348700715578]
We present new theoretical results on the identifiability of nonlinear latent hierarchical causal models.
We develop a novel differentiable causal discovery algorithm that efficiently estimates the structure of such models.
arXiv Detail & Related papers (2024-11-29T09:08:20Z) - Influence Functions for Scalable Data Attribution in Diffusion Models [52.92223039302037]
Diffusion models have led to significant advancements in generative modelling.
Yet their widespread adoption poses challenges regarding data attribution and interpretability.
We develop an influence functions framework to address these challenges.
arXiv Detail & Related papers (2024-10-17T17:59:02Z) - Targeted Cause Discovery with Data-Driven Learning [66.86881771339145]
We propose a novel machine learning approach for inferring causal variables of a target variable from observations.
We employ a neural network trained to identify causality through supervised learning on simulated data.
Empirical results demonstrate the effectiveness of our method in identifying causal relationships within large-scale gene regulatory networks.
arXiv Detail & Related papers (2024-08-29T02:21:11Z) - Neural Flow Diffusion Models: Learnable Forward Process for Improved Diffusion Modelling [2.1779479916071067]
We introduce a novel framework that enhances diffusion models by supporting a broader range of forward processes.
We also propose a novel parameterization technique for learning the forward process.
Results underscore NFDM's versatility and its potential for a wide range of applications.
arXiv Detail & Related papers (2024-04-19T15:10:54Z) - Bayesian learning of Causal Structure and Mechanisms with GFlowNets and Variational Bayes [51.84122462615402]
We introduce a novel method to learn the structure and mechanisms of the causal model using Variational Bayes-DAG-GFlowNet.
We extend the method of Bayesian causal structure learning using GFlowNets to learn the parameters of a linear-Gaussian model.
arXiv Detail & Related papers (2022-11-04T21:57:39Z) - ER: Equivariance Regularizer for Knowledge Graph Completion [107.51609402963072]
We propose a new regularizer, namely, Equivariance Regularizer (ER)
ER can enhance the generalization ability of the model by employing the semantic equivariance between the head and tail entities.
The experimental results indicate a clear and substantial improvement over the state-of-the-art relation prediction methods.
arXiv Detail & Related papers (2022-06-24T08:18:05Z) - Variational Causal Networks: Approximate Bayesian Inference over Causal Structures [132.74509389517203]
We introduce a parametric variational family modelled by an autoregressive distribution over the space of discrete DAGs.
In experiments, we demonstrate that the proposed variational posterior is able to provide a good approximation of the true posterior.
arXiv Detail & Related papers (2021-06-14T17:52:49Z) - Latent Causal Invariant Model [128.7508609492542]
Current supervised learning can learn spurious correlation during the data-fitting process.
We propose a Latent Causal Invariance Model (LaCIM) which pursues causal prediction.
arXiv Detail & Related papers (2020-11-04T10:00:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.