Differentiable Bayesian Structure Learning with Acyclicity Assurance
- URL: http://arxiv.org/abs/2309.01392v2
- Date: Wed, 6 Sep 2023 04:28:39 GMT
- Title: Differentiable Bayesian Structure Learning with Acyclicity Assurance
- Authors: Quang-Duy Tran, Phuoc Nguyen, Bao Duong, Thin Nguyen
- Abstract summary: We propose an alternative approach that strictly constrains the acyclicity of the graphs by integrating knowledge from the topological orderings.
Our approach reduces inference complexity while ensuring that the structures of the generated graphs are acyclic.
- Score: 7.568978862189266
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Score-based approaches in the structure learning task are thriving because of their scalability, and continuous relaxation has been the key reason for this advancement. Despite achieving promising outcomes, most of these methods still struggle to ensure that the graphs generated from the latent space are acyclic when acyclicity is enforced only through a minimized score. There has also been another trend of permutation-based approaches, which search for the topological ordering of the variables in the directed acyclic graph in order to limit the search space of the graph. In this study, we propose an alternative approach that strictly constrains the acyclicity of the graphs by integrating knowledge from the topological orderings. Our approach can reduce inference complexity while ensuring that the structures of the generated graphs are acyclic. Our empirical experiments with simulated and real-world data show that our approach can outperform related Bayesian score-based approaches.
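To make the key idea concrete, a minimal sketch follows (an illustration in a NumPy setting, not the paper's implementation): if candidate edge scores are masked so that edges may only point forward in a topological ordering, every generated graph is acyclic by construction, with no acyclicity score left to minimize. The node count, random scores, and the fixed ordering are arbitrary assumptions for the example; in the paper's setting the ordering itself would be inferred.

```python
import numpy as np

def mask_by_ordering(scores: np.ndarray, order: np.ndarray) -> np.ndarray:
    """Keep edge i -> j only if node i precedes node j in the given ordering.

    `scores` is any real-valued matrix of candidate edge strengths (e.g.
    decoded from a latent sample); `order` is a permutation of the node
    indices. Every kept edge points "forward" in the ordering, so the
    returned graph is acyclic by construction.
    """
    d = scores.shape[0]
    rank = np.empty(d, dtype=int)
    rank[order] = np.arange(d)                 # rank[i] = position of node i
    forward = rank[:, None] < rank[None, :]    # True where i comes before j
    return np.where(forward, scores, 0.0)

def is_acyclic(adj: np.ndarray) -> bool:
    """Check acyclicity by summing powers of the binary adjacency matrix."""
    d = adj.shape[0]
    reach = (adj != 0).astype(float)
    total = np.zeros((d, d))
    power = np.eye(d)
    for _ in range(d):
        power = power @ reach
        total += power
    return np.allclose(np.diag(total), 0.0)    # zero diagonal = no directed cycles

rng = np.random.default_rng(0)
raw = rng.normal(size=(5, 5))    # unconstrained edge scores; may imply cycles
order = rng.permutation(5)       # stand-in for an inferred topological ordering
dag = mask_by_ordering(raw, order)
print(is_acyclic(dag))           # True: no acyclicity penalty is needed
```

In a full differentiable pipeline the ordering would be sampled or learned rather than fixed, but the masking step is what removes the need for a soft acyclicity penalty.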
Related papers
- ExDBN: Exact learning of Dynamic Bayesian Networks [2.2499166814992435]
We propose a score-based approach for learning causal structure from data.
We show that the proposed approach produces excellent results when applied to small and medium-sized synthetic instances with up to 25 time series.
Two applications in bioscience and finance, to which the method is directly applied, further highlight the opportunities in developing highly accurate, globally convergent solvers.
arXiv Detail & Related papers (2024-10-21T15:27:18Z) - Kernel-Based Differentiable Learning of Non-Parametric Directed Acyclic Graphical Models [17.52142371968811]
Causal discovery amounts to learning a directed acyclic graph (DAG) that encodes a causal model.
Recent research has sought to bypass the combinatorial search by reformulating causal discovery as a continuous optimization problem.
arXiv Detail & Related papers (2024-08-20T16:09:40Z) - Integer Programming for Learning Directed Acyclic Graphs from Non-identifiable Gaussian Models [6.54203362045253]
We study the problem of learning directed acyclic graphs from continuous observational data.
We develop a mixed-integer programming framework for learning directed acyclic graphs on medium-sized problems.
Our method outperforms state-of-the-art algorithms and is robust to heteroscedastic noise.
arXiv Detail & Related papers (2024-04-19T02:42:13Z) - Constraint-Free Structure Learning with Smooth Acyclic Orientations [16.556484521585197]
We introduce COSMO, a constraint-free continuous optimization scheme for acyclic structure learning.
Despite the absence of explicit constraints, we prove that COSMO always converges to an acyclic solution.
arXiv Detail & Related papers (2023-09-15T14:08:09Z) - Bures-Wasserstein Means of Graphs [60.42414991820453]
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
arXiv Detail & Related papers (2023-05-31T11:04:53Z) - Geometry Contrastive Learning on Heterogeneous Graphs [50.58523799455101]
This paper proposes a novel self-supervised learning method, termed Geometry Contrastive Learning (GCL).
GCL views a heterogeneous graph from Euclidean and hyperbolic perspectives simultaneously, aiming to combine their strengths in modeling rich semantics and complex structures.
Extensive experiments on four benchmark data sets show that the proposed approach outperforms strong baselines.
arXiv Detail & Related papers (2022-06-25T03:54:53Z) - Score matching enables causal discovery of nonlinear additive noise
models [63.93669924730725]
We show how to design a new generation of scalable causal discovery methods.
We propose a new, efficient method for approximating the score's Jacobian, which enables recovery of the causal graph.
arXiv Detail & Related papers (2022-03-08T21:34:46Z) - Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations lead to encoders whose outputs are naturally modeled as distributions.
Our proposed method represents each node by a distribution in the latent space, in contrast to existing techniques which embed each node as a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
arXiv Detail & Related papers (2021-12-15T01:45:32Z) - Efficient Neural Causal Discovery without Acyclicity Constraints [30.08586535981525]
We present ENCO, an efficient structure learning method for directed, acyclic causal graphs.
In experiments, we show that ENCO can efficiently recover graphs with hundreds of nodes, an order of magnitude larger than what was previously possible.
arXiv Detail & Related papers (2021-07-22T07:01:41Z) - DAGs with No Curl: An Efficient DAG Structure Learning Approach [62.885572432958504]
Recently, directed acyclic graph (DAG) structure learning has been formulated as a constrained continuous optimization problem with continuous acyclicity constraints; a generic sketch of one such constraint appears after this list.
We propose a novel learning framework to model and learn the weighted adjacency matrices in the DAG space directly.
We show that our method provides comparable accuracy but better efficiency than baseline DAG structure learning methods on both linear and generalized structural equation models.
arXiv Detail & Related papers (2021-06-14T07:11:36Z) - Structured Graph Learning for Clustering and Semi-supervised
Classification [74.35376212789132]
We propose a graph learning framework to preserve both the local and global structure of data.
Our method uses the self-expressiveness of samples to capture the global structure and adaptive neighbor approach to respect the local structure.
Our model is equivalent to a combination of kernel k-means and k-means methods under certain conditions.
arXiv Detail & Related papers (2020-08-31T08:41:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.