Meta-Dependence in Conditional Independence Testing
- URL: http://arxiv.org/abs/2504.12594v1
- Date: Thu, 17 Apr 2025 02:41:22 GMT
- Title: Meta-Dependence in Conditional Independence Testing
- Authors: Bijan Mazaheri, Jiaqi Zhang, Caroline Uhler
- Abstract summary: We study a "meta-dependence" between conditional independence properties using the following geometric intuition. We provide a simple-to-compute measure of this meta-dependence using information projections and consolidate our findings empirically using both synthetic and real-world data.
- Score: 11.302018782958205
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Constraint-based causal discovery algorithms utilize many statistical tests for conditional independence to uncover networks of causal dependencies. These approaches to causal discovery rely on an assumed correspondence between the graphical properties of a causal structure and the conditional independence properties of observed variables, known as the causal Markov condition and faithfulness. Finite data yields an empirical distribution that is "close" to the actual distribution. Across these many possible empirical distributions, the correspondence to the graphical properties can break down for different conditional independencies, and multiple violations can occur at the same time. We study this "meta-dependence" between conditional independence properties using the following geometric intuition: each conditional independence property constrains the space of possible joint distributions to a manifold. The "meta-dependence" between conditional independences is informed by the position of these manifolds relative to the true probability distribution. We provide a simple-to-compute measure of this meta-dependence using information projections and consolidate our findings empirically using both synthetic and real-world data.
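To make the information-projection idea concrete, here is a minimal sketch for discrete variables (not the authors' implementation; the `ci_projection` and `kl_divergence` helpers and the example distribution are illustrative assumptions). For a single statement X ⊥ Y | Z, the projection of a joint pmf onto the corresponding conditional independence manifold has a closed form, and the resulting divergence coincides with the conditional mutual information I(X; Y | Z); the paper's meta-dependence measure builds on such projections onto multiple CI manifolds.
```python
import numpy as np

def ci_projection(p):
    """Project a discrete joint pmf p[x, y, z] onto the manifold of
    distributions satisfying X _||_ Y | Z, minimizing KL(p || q).
    For discrete variables this projection has the closed form
    q(x, y, z) = p(z) * p(x | z) * p(y | z)."""
    p_z = p.sum(axis=(0, 1), keepdims=True)   # p(z)
    p_xz = p.sum(axis=1, keepdims=True)        # p(x, z)
    p_yz = p.sum(axis=0, keepdims=True)        # p(y, z)
    with np.errstate(divide="ignore", invalid="ignore"):
        q = np.where(p_z > 0, p_xz * p_yz / p_z, 0.0)
    return q

def kl_divergence(p, q):
    """KL(p || q) in nats, restricted to the support of p."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical example: a random joint pmf over binary X, Y and ternary Z.
rng = np.random.default_rng(0)
p = rng.random((2, 2, 3))
p /= p.sum()

q = ci_projection(p)
# KL(p || q) equals the conditional mutual information I(X; Y | Z),
# which is zero exactly when X _||_ Y | Z already holds in p.
print("distance to the CI manifold:", kl_divergence(p, q))
```
A divergence of zero means the distribution already lies on the manifold; how close several such manifolds sit to the true distribution is what, per the abstract, informs the meta-dependence between the corresponding conditional independencies.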
Related papers
- Conditional Dependence via U-Statistics Pruning [11.552000005640203]
This paper presents a novel measure of conditional dependence based on the use of incomplete unbiased statistics of degree two. The proposed approach is articulated as an extension of the Hilbert-Schmidt independence criterion, which becomes expressible through kernels that operate on 4-tuples of data.
arXiv Detail & Related papers (2024-10-21T11:06:09Z)
- On Discovery of Local Independence over Continuous Variables via Neural Contextual Decomposition [26.34622544479565]
We define and characterize the local independence relationship that holds in a specific set of joint assignments of parental variables.
We propose a novel method, coined neural contextual decomposition (NCD), which learns such a partition by constraining each set to induce CSSI.
arXiv Detail & Related papers (2024-05-12T08:48:37Z)
- Federated Causal Discovery from Heterogeneous Data [70.31070224690399]
We propose a novel FCD method attempting to accommodate arbitrary causal models and heterogeneous data.
These approaches involve constructing summary statistics as a proxy for the raw data to protect data privacy.
We conduct extensive experiments on synthetic and real datasets to show the efficacy of our method.
arXiv Detail & Related papers (2024-02-20T18:53:53Z)
- Interventional Causal Representation Learning [75.18055152115586]
Causal representation learning seeks to extract high-level latent factors from low-level sensory data.
Can interventional data facilitate causal representation learning?
We show that interventional data often carries geometric signatures of the latent factors' support.
arXiv Detail & Related papers (2022-09-24T04:59:03Z)
- Non-Parametric Inference of Relational Dependence [17.76905154531867]
This work examines the problem of estimating independence in data drawn from relational systems.
We propose a consistent, non-parametric, scalable kernel test to operationalize the relational independence test for non-i.i.d. observational data.
arXiv Detail & Related papers (2022-06-30T03:42:20Z)
- Nonparametric Conditional Local Independence Testing [69.31200003384122]
Conditional local independence is an independence relation among continuous time processes.
No nonparametric test of conditional local independence has been available.
We propose such a nonparametric test based on double machine learning.
arXiv Detail & Related papers (2022-03-25T10:31:02Z)
- Exploiting Independent Instruments: Identification and Distribution Generalization [3.701112941066256]
We exploit the independence of the instruments for distribution generalization by taking into account higher moments.
We prove that the proposed estimator is invariant to distributional shifts on the instruments.
These results hold even in the under-identified case where the instruments are not sufficiently rich to identify the causal function.
arXiv Detail & Related papers (2022-02-03T21:49:04Z)
- Transitional Conditional Independence [0.0]
We introduce transition probability spaces and transitional random variables.
These constructions generalize and strengthen previous notions of (conditional) random variables and non-stochastic variables.
arXiv Detail & Related papers (2021-04-23T11:52:15Z)
- Disentangling Observed Causal Effects from Latent Confounders using Method of Moments [67.27068846108047]
We provide guarantees on identifiability and learnability under mild assumptions.
We develop efficient algorithms based on coupled tensor decomposition with linear constraints to obtain scalable and guaranteed solutions.
arXiv Detail & Related papers (2021-01-17T07:48:45Z)
- On Disentangled Representations Learned From Correlated Data [59.41587388303554]
We bridge the gap to real-world scenarios by analyzing the behavior of the most prominent disentanglement approaches on correlated data.
We show that systematically induced correlations in the dataset are being learned and reflected in the latent representations.
We also demonstrate how to resolve these latent correlations, either using weak supervision during training or by post-hoc correcting a pre-trained model with a small number of labels.
arXiv Detail & Related papers (2020-06-14T12:47:34Z)
- CausalVAE: Structured Causal Disentanglement in Variational Autoencoder [52.139696854386976]
The framework of variational autoencoder (VAE) is commonly used to disentangle independent factors from observations.
We propose a new VAE-based framework named CausalVAE, which includes a Causal Layer to transform independent factors into causal endogenous ones.
Results show that the causal representations learned by CausalVAE are semantically interpretable, and their causal relationship as a Directed Acyclic Graph (DAG) is identified with good accuracy.
arXiv Detail & Related papers (2020-04-18T20:09:34Z)