Moment-Matching Graph-Networks for Causal Inference
- URL: http://arxiv.org/abs/2007.10507v2
- Date: Mon, 27 Jul 2020 13:13:40 GMT
- Title: Moment-Matching Graph-Networks for Causal Inference
- Authors: Michael Park
- Abstract summary: This note explores a fully unsupervised deep-learning framework for simulating non-linear structural equation models from observational training data.
The main contribution of this note is an architecture for applying moment-matching loss functions to the edges of a causal Bayesian graph, resulting in a generative conditional-moment-matching graph-neural-network.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this note we explore a fully unsupervised deep-learning framework for
simulating non-linear structural equation models from observational training
data. The main contribution of this note is an architecture for applying
moment-matching loss functions to the edges of a causal Bayesian graph,
resulting in a generative conditional-moment-matching graph-neural-network.
This framework thus enables automated sampling of latent space conditional
probability distributions for various graphical interventions, and is capable
of generating out-of-sample interventional probabilities that are often
faithful to the ground truth distributions well beyond the range contained in
the training set. These methods could in principle be used in conjunction with
any existing autoencoder that produces a latent space representation containing
causal graph structures.
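The paper's exact edge-wise loss is not reproduced in this summary. As a rough sketch of the moment-matching idea it builds on, the squared maximum mean discrepancy (MMD) between generated and observed samples can be estimated with a Gaussian kernel; the kernel choice and bandwidth here are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    """Pairwise Gaussian (RBF) kernel between rows of a and b."""
    d = a[:, None, :] - b[None, :, :]          # (n, m, dim) differences
    return np.exp(-np.sum(d**2, axis=-1) / (2 * sigma**2))

def mmd2(x, y, sigma=1.0):
    """Biased estimate of squared MMD between samples x and y.

    Zero when x and y are the same sample; grows as the empirical
    distributions (all their kernel moments) diverge.
    """
    kxx = gaussian_kernel(x, x, sigma).mean()
    kyy = gaussian_kernel(y, y, sigma).mean()
    kxy = gaussian_kernel(x, y, sigma).mean()
    return kxx + kyy - 2.0 * kxy
```

In a conditional-moment-matching setup, a loss of this form would be applied per edge of the causal graph, comparing samples generated under a parent configuration against the corresponding observed conditionals.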
Related papers
- Binary Flow Matching: Prediction-Loss Space Alignment for Robust Learning [23.616336786063552]
Flow matching has emerged as a powerful framework for generative modeling.
We identify a latent structural mismatch that arises when it is coupled with velocity-based objectives.
We prove that re-aligning the objective to the signal space eliminates the singular weighting.
arXiv Detail & Related papers (2026-02-11T02:02:30Z)
- Topology Identification and Inference over Graphs [61.06365536861156]
Topology identification and inference of processes evolving over graphs arise in timely applications involving brain, transportation, financial, power, as well as social and information networks.
This chapter provides an overview of graph topology identification and statistical inference methods for multidimensional data.
arXiv Detail & Related papers (2025-12-11T00:47:09Z)
- QGraphLIME - Explaining Quantum Graph Neural Networks [0.48998185508205744]
Quantum graph neural networks offer a powerful paradigm for learning on graph-structured data.
QuantumGraphLIME treats model explanations as distributions over local surrogates fit on structure-preserving perturbations of a graph.
arXiv Detail & Related papers (2025-10-07T08:39:13Z)
- Nonparametric learning of heterogeneous graphical model on network-linked data [19.215806260939473]
This paper proposes a nonparametric graphical model that accommodates heterogeneous graph structures without imposing any distributional assumptions.
It transforms the graph learning task into solving a finite-dimensional linear equation system by leveraging the properties of a vector-valued reproducing kernel Hilbert space.
Its effectiveness is also demonstrated through a variety of simulated examples and a real application to the statistician coauthorship dataset.
arXiv Detail & Related papers (2025-07-02T08:37:15Z)
- Generative Flow Networks: Theory and Applications to Structure Learning [7.6872614776094]
This thesis studies the problem of structure learning from a Bayesian perspective.
It introduces Generative Flow Networks (GFlowNets), which treat generation as a sequential decision-making problem.
arXiv Detail & Related papers (2025-01-09T17:47:17Z)
- Sub-graph Based Diffusion Model for Link Prediction [43.15741675617231]
Denoising Diffusion Probabilistic Models (DDPMs) represent a contemporary class of generative models with exceptional qualities.
We build a novel generative model for link prediction using a dedicated design to decompose the likelihood estimation process via the Bayesian formula.
Our proposed method presents numerous advantages: (1) transferability across datasets without retraining, (2) promising generalization on limited training data, and (3) robustness against graph adversarial attacks.
arXiv Detail & Related papers (2024-09-13T02:23:55Z)
- Bures-Wasserstein Means of Graphs [60.42414991820453]
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
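The paper's iterative algorithm is not detailed in this summary. For intuition, the classical fixed-point iteration for the Bures-Wasserstein barycenter of zero-mean Gaussian covariance matrices, which underlies this kind of graph-signal-distribution embedding, can be sketched as follows; this is the standard textbook iteration, not necessarily the paper's exact procedure.

```python
import numpy as np

def sqrtm_psd(a):
    """Matrix square root of a symmetric PSD matrix via eigendecomposition."""
    w, v = np.linalg.eigh(a)
    return (v * np.sqrt(np.clip(w, 0.0, None))) @ v.T

def bures_wasserstein_mean(covs, iters=50):
    """Fixed-point iteration for the Bures-Wasserstein barycenter.

    Iterates S <- S^{-1/2} [ mean_i (S^{1/2} K_i S^{1/2})^{1/2} ]^2 S^{-1/2}
    starting from the Euclidean mean of the covariances.
    """
    s = np.mean(covs, axis=0)
    for _ in range(iters):
        r = sqrtm_psd(s)                 # S^{1/2} (symmetric)
        r_inv = np.linalg.inv(r)
        t = np.mean([sqrtm_psd(r @ c @ r) for c in covs], axis=0)
        s = r_inv @ t @ t @ r_inv
    return s
```

For commuting (e.g. diagonal) covariances this reduces to averaging the matrix square roots and squaring the result, which makes the iteration easy to sanity-check.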
arXiv Detail & Related papers (2023-05-31T11:04:53Z)
- ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce the powerful model class of Denoising Diffusion Probabilistic Models (DDPMs) for chirographic data.
Our model, named "ChiroDiff", is non-autoregressive; it learns to capture holistic concepts and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z)
- Bayesian Graph Contrastive Learning [55.36652660268726]
We propose a novel perspective on graph contrastive learning methods, showing that random augmentations lead to stochastic encoders.
Our proposed method represents each node by a distribution in the latent space in contrast to existing techniques which embed each node to a deterministic vector.
We show a considerable improvement in performance compared to existing state-of-the-art methods on several benchmark datasets.
arXiv Detail & Related papers (2021-12-15T01:45:32Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
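As a toy illustration of what "an inner product on graphs" means (not the paper's architecture), the vertex-histogram kernel compares two graphs via the inner product of their node-label histograms; the label sets here are made-up examples.

```python
from collections import Counter

def vertex_histogram_kernel(labels_g, labels_h):
    """Vertex-histogram graph kernel.

    Counts node labels in each graph and returns the inner product of the
    two histograms, i.e. sum over shared labels of count_g * count_h.
    This is a valid positive-semidefinite kernel because it is an explicit
    inner product of histogram feature vectors.
    """
    cg, ch = Counter(labels_g), Counter(labels_h)
    return sum(cg[label] * ch[label] for label in cg.keys() & ch.keys())
```

Richer kernels (shortest-path, Weisfeiler-Lehman) follow the same pattern with more structure-aware feature maps, which is what makes the kernel a plug-in component.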
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Graph Belief Propagation Networks [34.137798598227874]
We introduce a model that combines the advantages of graph neural networks and collective classification.
In our model, potentials on each node only depend on that node's features, and edge potentials are learned via a coupling matrix.
Our approach can be viewed as either an interpretable message-passing graph neural network or a collective classification method with higher capacity and modernized training.
arXiv Detail & Related papers (2021-06-06T05:24:06Z)
- Learning non-Gaussian graphical models via Hessian scores and triangular transport [6.308539010172309]
We propose an algorithm for learning the Markov structure of continuous and non-Gaussian distributions.
Our algorithm SING estimates the density using a deterministic coupling, induced by a triangular transport map, and iteratively exploits sparse structure in the map to reveal sparsity in the graph.
arXiv Detail & Related papers (2021-01-08T16:42:42Z)
- Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
arXiv Detail & Related papers (2020-06-05T07:06:19Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for fitting such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.