Neural Enhanced Belief Propagation on Factor Graphs
- URL: http://arxiv.org/abs/2003.01998v5
- Date: Tue, 16 Mar 2021 14:51:58 GMT
- Title: Neural Enhanced Belief Propagation on Factor Graphs
- Authors: Victor Garcia Satorras, Max Welling
- Abstract summary: A graphical model is a structured representation of locally dependent random variables.
We first extend graph neural networks to factor graphs (FG-GNN).
We then propose a new hybrid model that runs an FG-GNN conjointly with belief propagation.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A graphical model is a structured representation of locally dependent random
variables. A traditional method to reason over these random variables is to
perform inference using belief propagation. When provided with the true data
generating process, belief propagation can infer the optimal posterior
probability estimates in tree-structured factor graphs. However, in many cases
we may only have access to a poor approximation of the data generating process,
or we may face loops in the factor graph, leading to suboptimal estimates. In
this work we first extend graph neural networks to factor graphs (FG-GNN). We
then propose a new hybrid model that runs an FG-GNN conjointly with belief
propagation. The FG-GNN receives as input the messages from belief propagation at
every inference iteration and outputs a corrected version of them. As a result,
we obtain a more accurate algorithm that combines the benefits of both belief
propagation and graph neural networks. We apply our ideas to error correction
decoding tasks, and we show that our algorithm can outperform belief
propagation for LDPC codes on bursty channels.
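The abstract describes an iterative scheme in which standard sum-product belief propagation proposes messages and a learned network refines them before the next round. Below is a minimal sketch of that hybrid loop on a toy binary factor graph; the names (gnn_correction), the tiny random MLP standing in for the trained FG-GNN, and the additive-correction form are all illustrative assumptions, not the authors' implementation.

```python
# Sketch of the hybrid "BP proposes, network corrects" loop from the
# abstract. The graph, the names, and the random MLP standing in for the
# trained FG-GNN are illustrative assumptions, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

# Toy factor graph: a chain of 3 binary variables joined by 2 pairwise
# factors. Each entry is (var_i, var_j, table) with table[xi, xj] > 0.
factors = [
    (0, 1, rng.uniform(0.5, 2.0, size=(2, 2))),
    (1, 2, rng.uniform(0.5, 2.0, size=(2, 2))),
]
n_vars = 3

def normalize(m):
    return m / m.sum()

# Stand-in for the trained FG-GNN: a fixed random two-layer MLP mapping a
# single message to an additive correction. The real FG-GNN runs message
# passing over the whole factor graph and is trained end to end.
W1 = rng.normal(0.0, 0.1, size=(8, 2))
W2 = rng.normal(0.0, 0.1, size=(2, 8))

def gnn_correction(m):
    return W2 @ np.tanh(W1 @ m)

# Variable -> factor messages, initialized uniform.
v2f = {(v, k): np.full(2, 0.5)
       for k, (i, j, _) in enumerate(factors) for v in (i, j)}

for _ in range(10):
    # 1) BP factor -> variable messages (marginalize the other endpoint).
    f2v = {}
    for k, (i, j, table) in enumerate(factors):
        f2v[(k, i)] = normalize(table @ v2f[(j, k)])
        f2v[(k, j)] = normalize(table.T @ v2f[(i, k)])
    # 2) Neural refinement: each BP message is replaced by a corrected
    #    version, clipped to remain a valid (positive) message.
    f2v = {key: normalize(np.clip(m + gnn_correction(m), 1e-6, None))
           for key, m in f2v.items()}
    # 3) BP variable -> factor messages, computed from corrected messages.
    for k, (i, j, _) in enumerate(factors):
        for v in (i, j):
            msg = np.ones(2)
            for k2, (i2, j2, _) in enumerate(factors):
                if k2 != k and v in (i2, j2):
                    msg *= f2v[(k2, v)]
            v2f[(v, k)] = normalize(msg)

# Approximate marginals from the corrected factor -> variable messages.
for v in range(n_vars):
    belief = np.ones(2)
    for (k, var), m in f2v.items():
        if var == v:
            belief *= m
    print(f"variable {v}: P(x=1) = {normalize(belief)[1]:.3f}")
```

In the paper's setting the correction network would be trained end to end (e.g., against known codewords in LDPC decoding); the random weights above only mark where the learned refinement enters the message-passing schedule.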
Related papers
- Learning to Reweight for Graph Neural Network
Graph Neural Networks (GNNs) show promising results for graph tasks.
The generalization ability of existing GNNs degrades when there are distribution shifts between training and testing graph data.
We propose a novel nonlinear graph decorrelation method, which can substantially improve the out-of-distribution generalization ability.
arXiv Detail & Related papers (2023-12-19T12:25:10Z)
- Factor Graph Neural Networks
Graph Neural Networks (GNNs) can learn powerful representations in an end-to-end fashion with great success in many real-world applications.
We propose Factor Graph Neural Networks (FGNNs) to effectively capture higher-order relations for inference and learning.
arXiv Detail & Related papers (2023-08-02T00:32:02Z)
- Invertible Neural Networks for Graph Prediction
In this work, we address conditional generation using deep invertible neural networks.
We adopt an end-to-end training approach since our objective is to address prediction and generation in the forward and backward processes at once.
arXiv Detail & Related papers (2022-06-02T17:28:33Z)
- Discovering Invariant Rationales for Graph Neural Networks
Intrinsic interpretability of graph neural networks (GNNs) means finding a small subset of the input graph's features that guides the model's prediction.
We propose a new strategy of discovering invariant rationale (DIR) to construct intrinsically interpretable GNNs.
arXiv Detail & Related papers (2022-01-30T16:43:40Z)
- Generalizing Graph Neural Networks on Out-Of-Distribution Graphs
Graph Neural Networks (GNNs) are typically proposed without considering distribution shifts between training and testing graphs.
In such a setting, GNNs tend to exploit subtle statistical correlations in the training set for prediction, even when those correlations are spurious.
We propose a general causal representation framework, called StableGNN, to eliminate the impact of spurious correlations.
arXiv Detail & Related papers (2021-11-20T18:57:18Z)
- Robust Counterfactual Explanations on Graph Neural Networks
Massive deployment of Graph Neural Networks (GNNs) in high-stakes applications generates a strong demand for explanations that are robust to noise.
Most existing methods generate explanations by identifying a subgraph of an input graph that has a strong correlation with the prediction.
We propose a novel method to generate robust counterfactual explanations on GNNs by explicitly modelling the common decision logic of GNNs on similar input graphs.
arXiv Detail & Related papers (2021-07-08T19:50:00Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
To preserve node proximities, we augment existing GNNs with stochastic node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Block-Approximated Exponential Random Graphs
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)