ExDAG: Exact learning of DAGs
- URL: http://arxiv.org/abs/2406.15229v1
- Date: Fri, 21 Jun 2024 15:15:38 GMT
- Title: ExDAG: Exact learning of DAGs
- Authors: Pavel Rytíř, Aleš Wodecki, Jakub Mareček
- Abstract summary: We provide a novel mixed-integer quadratic programming formulation and associated algorithm that identifies DAGs on up to 50 vertices, where these are identifiable.
Although there is a superexponential number of constraints that prevent the formation of cycles, the algorithm adds constraints violated by solutions found, rather than imposing all constraints in each continuous-valued relaxation.
Our empirical results show that ExDAG outperforms local state-of-the-art solvers in terms of precision and outperforms state-of-the-art global solvers with respect to scaling.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: There has been a growing interest in causal learning in recent years. Commonly used representations of causal structures, including Bayesian networks and structural equation models (SEM), take the form of directed acyclic graphs (DAGs). We provide a novel mixed-integer quadratic programming formulation and associated algorithm that identifies DAGs on up to 50 vertices, where these are identifiable. We call this method ExDAG, which stands for Exact learning of DAGs. Although there is a superexponential number of constraints that prevent the formation of cycles, the algorithm adds constraints violated by solutions found, rather than imposing all constraints in each continuous-valued relaxation. Our empirical results show that ExDAG outperforms local state-of-the-art solvers in terms of precision and outperforms state-of-the-art global solvers with respect to scaling, when considering Gaussian noise. We also provide validation with respect to other noise distributions.
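The lazy treatment of cycle-exclusion constraints described in the abstract can be illustrated with a toy sketch: repeatedly test a candidate graph for a directed cycle and, when one is found, cut it before re-checking. This is only an illustrative stand-in for the real algorithm, which adds the violated cycle inequality to the MIQP via a solver callback and re-solves; the names `find_cycle` and `lazy_acyclic_repair` are hypothetical.

```python
import numpy as np

def find_cycle(adj):
    """Return one directed cycle as a list of edges, or None if acyclic."""
    n = len(adj)
    color = [0] * n          # 0 = unvisited, 1 = on DFS stack, 2 = done
    stack = []

    def dfs(u):
        color[u] = 1
        stack.append(u)
        for v in range(n):
            if adj[u][v]:
                if color[v] == 1:            # back edge closes a cycle
                    i = stack.index(v)
                    cyc = stack[i:] + [v]
                    return list(zip(cyc, cyc[1:]))
                if color[v] == 0:
                    found = dfs(v)
                    if found:
                        return found
        stack.pop()
        color[u] = 2
        return None

    for s in range(n):
        if color[s] == 0:
            found = dfs(s)
            if found:
                return found
    return None

def lazy_acyclic_repair(weights, threshold=0.1):
    """Toy stand-in for lazy constraint generation: threshold the weights to
    get a candidate graph, then repeatedly find a violated cycle and break it
    by removing its weakest edge.  A real solver would instead add the cycle
    constraint to the MIQP and re-solve the relaxation."""
    adj = (np.abs(weights) > threshold).astype(int)
    np.fill_diagonal(adj, 0)
    while True:
        cycle = find_cycle(adj)
        if cycle is None:
            return adj
        u, v = min(cycle, key=lambda e: abs(weights[e[0], e[1]]))
        adj[u, v] = 0
```

The point of the loop is that only constraints actually violated by a candidate solution are ever materialized, which is what lets the method cope with the superexponential number of possible cycle constraints.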
Related papers
- Nonlinear Causal Discovery through a Sequential Edge Orientation Approach [5.807183284468881]
We propose a sequential procedure to orient undirected edges in a completed partial DAG.
We prove that this procedure can recover the true causal DAG assuming a restricted ANM.
We develop a novel constraint-based algorithm for learning causal DAGs under nonlinear ANMs.
arXiv Detail & Related papers (2025-06-05T21:08:13Z)
- Analytic DAG Constraints for Differentiable DAG Learning [83.93320658222717]
We develop a theory to establish a connection between analytic functions and DAG constraints.
We show that analytic functions from the set $\{f(x) = c_0 + \sum_{i=1}^{\infty} c_i x^i \mid \forall i > 0,\ c_i > 0;\ r = \lim_{i \to \infty} c_i/c_{i+1} > 0\}$ can be employed to formulate effective DAG constraints.
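A well-known instance of such an analytic constraint is $f(x) = e^x$, whose Taylor coefficients $1/i!$ are all positive: the resulting score $h(W) = \mathrm{tr}(\exp(W \circ W)) - d$ (the NOTEARS acyclicity constraint) vanishes exactly when the weighted graph $W$ is acyclic. A minimal sketch, assuming SciPy is available:

```python
import numpy as np
from scipy.linalg import expm

def dag_constraint_exp(W):
    """h(W) = tr(exp(W o W)) - d, where o is the elementwise product:
    zero iff the support of W is acyclic, strictly positive otherwise."""
    d = W.shape[0]
    return float(np.trace(expm(W * W)) - d)
```

Because $h$ is smooth in $W$, it can be used as a differentiable penalty in gradient-based DAG learning, which is the setting the paper above generalizes.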
arXiv Detail & Related papers (2025-03-24T23:51:35Z)
- Gaussian Mixture Models Based Augmentation Enhances GNN Generalization [22.04352144324223]
We introduce a theoretical framework using Rademacher complexity to compute a regret bound on the generalization error.
This framework informs the design of GMM-GDA, an efficient graph data augmentation (GDA) algorithm.
arXiv Detail & Related papers (2024-11-13T14:26:04Z)
- Generalization of Graph Neural Networks is Robust to Model Mismatch [84.01980526069075]
Graph neural networks (GNNs) have demonstrated their effectiveness in various tasks supported by their generalization capabilities.
In this paper, we examine GNNs that operate on geometric graphs generated from manifold models.
Our analysis reveals the robustness of the GNN generalization in the presence of such model mismatch.
arXiv Detail & Related papers (2024-08-25T16:00:44Z)
- Learning Directed Acyclic Graphs from Partial Orderings [9.387234607473054]
Directed acyclic graphs (DAGs) are commonly used to model causal relationships among random variables.
In this paper, we consider the intermediate problem of learning DAGs when a partial causal ordering of variables is available.
We propose a general estimation framework for leveraging the partial ordering and present efficient estimation algorithms for low- and high-dimensional problems.
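In the simplest case of a fully known ordering, the idea can be sketched as ordinary regressions of each variable on its predecessors in the ordering, keeping only large coefficients as edges. The paper's actual estimators for partial orderings and high-dimensional problems are more involved; the name `dag_from_ordering` and the hard threshold are hypothetical choices for illustration.

```python
import numpy as np

def dag_from_ordering(X, order, threshold=0.3):
    """Toy sketch: given data X (n x d) and a full causal ordering of the
    d variables, estimate DAG edges by least-squares regression of each
    variable on all variables preceding it, keeping large coefficients."""
    n, d = X.shape
    W = np.zeros((d, d))
    for pos, j in enumerate(order):
        parents = order[:pos]
        if not parents:
            continue
        A = X[:, parents]
        coef, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        for p, c in zip(parents, coef):
            if abs(c) > threshold:
                W[p, j] = c
    return W
```

Knowing even a partial ordering shrinks the search space dramatically, since edge directions consistent with the ordering no longer have to be inferred from the data.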
arXiv Detail & Related papers (2024-03-24T06:14:50Z)
- Discovering Dynamic Causal Space for DAG Structure Learning [64.763763417533]
We propose a dynamic causal space for DAG structure learning, coined CASPER.
It integrates the graph structure into the score function as a new measure in the causal space to faithfully reflect the causal distance between estimated and ground truth DAG.
arXiv Detail & Related papers (2023-06-05T12:20:40Z)
- Graph Signal Sampling for Inductive One-Bit Matrix Completion: a Closed-form Solution [112.3443939502313]
We propose a unified graph signal sampling framework which enjoys the benefits of graph signal analysis and processing.
The key idea is to transform each user's ratings on the items to a function (signal) on the vertices of an item-item graph.
For the online setting, we develop a Bayesian extension, i.e., BGS-IMC which considers continuous random Gaussian noise in the graph Fourier domain.
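The graph Fourier domain referred to above is defined by the eigenvectors of the graph Laplacian; a vertex signal is transformed by projecting it onto that eigenbasis. A minimal sketch (the helper name `graph_fourier_transform` is hypothetical):

```python
import numpy as np

def graph_fourier_transform(adj, signal):
    """Project a vertex signal onto the eigenbasis of the combinatorial
    Laplacian L = D - A; the coefficients are the signal's graph spectrum
    and the eigenvalues play the role of frequencies."""
    L = np.diag(adj.sum(axis=1)) - adj
    eigvals, eigvecs = np.linalg.eigh(L)   # L is symmetric
    return eigvals, eigvecs.T @ signal
```

A smooth signal (such as a user's ratings on similar items) concentrates its energy in the low-frequency coefficients, which is what makes sampling and denoising in this domain effective.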
arXiv Detail & Related papers (2023-02-08T08:17:43Z)
- Semi-Supervised Heterogeneous Graph Learning with Multi-level Data Augmentation [8.697773215048286]
This paper presents a novel method named Semi-Supervised Heterogeneous Graph Learning with Multi-level Data Augmentation (HG-MDA).
To address the heterogeneity of information during data augmentation, node and topology augmentation strategies are proposed.
HG-MDA is applied to user identification in internet finance scenarios, helping the business increase its key users by 30%.
arXiv Detail & Related papers (2022-11-30T14:35:58Z)
- Effect Identification in Cluster Causal Diagrams [51.42809552422494]
We introduce a new type of graphical model called cluster causal diagrams (C-DAGs for short).
C-DAGs allow for the partial specification of relationships among variables based on limited prior knowledge.
We develop the foundations and machinery for valid causal inferences over C-DAGs.
arXiv Detail & Related papers (2022-02-22T21:27:31Z)
- BCD Nets: Scalable Variational Approaches for Bayesian Causal Discovery [97.79015388276483]
A structural equation model (SEM) is an effective framework to reason over causal relationships represented via a directed acyclic graph (DAG).
Recent advances enabled effective maximum-likelihood point estimation of DAGs from observational data.
We propose BCD Nets, a variational framework for estimating a distribution over DAGs characterizing a linear-Gaussian SEM.
arXiv Detail & Related papers (2021-12-06T03:35:21Z)
- Learning linear non-Gaussian directed acyclic graph with diverging number of nodes [12.49848873864773]
Acyclic model, often depicted as a directed acyclic graph (DAG), has been widely employed to represent directional causal relations among collected nodes.
We propose an efficient method to learn linear non-Gaussian DAG in high dimensional cases, where the noises can be of any continuous non-Gaussian distribution.
arXiv Detail & Related papers (2021-11-01T07:34:53Z)
- Cyclic Label Propagation for Graph Semi-supervised Learning [52.102251202186025]
We introduce a novel framework for graph semi-supervised learning called CycProp.
CycProp integrates GNNs into the process of label propagation in a cyclic and mutually reinforcing manner.
In particular, our proposed CycProp updates the node embeddings learned by GNN module with the augmented information by label propagation.
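The label-propagation side of this loop is the classical update $Y \leftarrow \alpha S Y + (1 - \alpha) Y_0$ with a symmetrically normalized adjacency $S$. A minimal standalone sketch of that building block, with the GNN coupling that CycProp adds omitted:

```python
import numpy as np

def label_propagation(adj, y0, alpha=0.9, iters=100):
    """Iterate Y <- alpha * S @ Y + (1 - alpha) * Y0, where
    S = D^{-1/2} A D^{-1/2} is the normalized adjacency and Y0 holds
    one-hot labels for the labeled nodes (zero rows elsewhere)."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg, dtype=float)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    S = adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    y = y0.astype(float).copy()
    for _ in range(iters):
        y = alpha * (S @ y) + (1.0 - alpha) * y0
    return y
```

Because $\alpha < 1$, the iteration converges to a fixed point that blends the clamped seed labels with labels diffused along graph edges; CycProp's contribution is to alternate this diffusion with GNN embedding updates.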
arXiv Detail & Related papers (2020-11-24T02:55:40Z)
- Learning DAGs without imposing acyclicity [0.6526824510982799]
We show that it is possible to learn a directed acyclic graph (DAG) from data without imposing the acyclicity constraint.
This approach is computationally efficient and is not affected by the explosion of complexity as in classical structural learning algorithms.
arXiv Detail & Related papers (2020-06-04T16:52:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.