Graph Belief Propagation Networks
- URL: http://arxiv.org/abs/2106.03033v1
- Date: Sun, 6 Jun 2021 05:24:06 GMT
- Title: Graph Belief Propagation Networks
- Authors: Junteng Jia, Cenk Baykal, Vamsi K. Potluru, Austin R. Benson
- Abstract summary: We introduce a model that combines the advantages of graph neural networks and collective classification.
In our model, potentials on each node only depend on that node's features, and edge potentials are learned via a coupling matrix.
Our approach can be viewed as either an interpretable message-passing graph neural network or a collective classification method with higher capacity and modernized training.
- Score: 34.137798598227874
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With the widespread availability of complex relational data, semi-supervised
node classification in graphs has become a central machine learning problem.
Graph neural networks are a recent class of easy-to-train and accurate methods
for this problem that map the features in the neighborhood of a node to its
label, but they ignore label correlation during inference and their predictions
are difficult to interpret. On the other hand, collective classification is a
traditional approach based on interpretable graphical models that explicitly
model label correlations. Here, we introduce a model that combines the
advantages of these two approaches, where we compute the marginal probabilities
in a conditional random field, similar to collective classification, and the
potentials in the random field are learned through end-to-end training, akin to
graph neural networks. In our model, potentials on each node only depend on
that node's features, and edge potentials are learned via a coupling matrix.
This structure enables simple training with interpretable parameters, scales to
large networks, naturally incorporates training labels at inference, and is
often more accurate than related approaches. Our approach can be viewed as
either an interpretable message-passing graph neural network or a collective
classification method with higher capacity and modernized training.
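The description above maps directly onto loopy belief propagation with feature-dependent unary potentials. Below is a minimal NumPy sketch of that inference step, assuming a simple linear feature map for the node potentials, a single shared symmetric coupling matrix for the edge potentials, and a flooding update schedule; the paper's exact parameterization and end-to-end training loop may differ.
```python
import numpy as np

def node_potentials(X, W):
    """Per-node class potentials from that node's features only: softmax(X @ W)."""
    logits = X @ W
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

def belief_propagation(edges, phi, H, n_iters=10):
    """Sum-product loopy BP; returns approximate per-node marginals.

    edges: undirected (u, v) pairs; phi: (n, c) node potentials;
    H: (c, c) symmetric coupling matrix shared by every edge.
    """
    n, c = phi.shape
    nbrs = {i: [] for i in range(n)}
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)
    # one message per directed edge, initialized uniform
    msgs = {(u, v): np.ones(c) / c for a, b in edges for u, v in [(a, b), (b, a)]}
    for _ in range(n_iters):
        new = {}
        for u, v in msgs:
            # product of u's potential and all incoming messages except v's
            b = phi[u].copy()
            for w in nbrs[u]:
                if w != v:
                    b *= msgs[(w, u)]
            m = H.T @ b            # push the belief through the edge potential
            new[(u, v)] = m / m.sum()
        msgs = new
    beliefs = phi.copy()
    for i in range(n):
        for w in nbrs[i]:
            beliefs[i] *= msgs[(w, i)]
    return beliefs / beliefs.sum(axis=1, keepdims=True)

# toy usage: 4 nodes, 2 classes, homophilous coupling on a path graph
rng = np.random.default_rng(0)
X, W = rng.normal(size=(4, 3)), rng.normal(size=(3, 2))
H = np.array([[0.9, 0.1], [0.1, 0.9]])
print(belief_propagation([(0, 1), (1, 2), (2, 3)], node_potentials(X, W), H))
```
In training, the gradient of a cross-entropy loss on the returned marginals would flow back into W and H, which is what gives the model its interpretable, GNN-like end-to-end character.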
Related papers
- Federated Graph Semantic and Structural Learning [54.97668931176513]
This paper reveals that local client distortion arises from both node-level semantics and graph-level structure.
We postulate that a well-structured graph neural network produces similar representations for neighboring nodes because of their inherent adjacency relationships.
We transform the adjacency relationships into the similarity distribution and leverage the global model to distill the relation knowledge into the local model.
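A hedged sketch of that distillation step, assuming the similarity distribution is a softmax over cosine similarities between a node and its neighbors and that the global model acts as the teacher in a KL objective; the function names and these choices are illustrative, not the paper's exact formulation:
```python
import numpy as np

def similarity_distribution(Z, node, neighbors, tau=1.0):
    """Softmax over cosine similarities between `node` and its neighbors."""
    z = Z[node] / np.linalg.norm(Z[node])
    N = Z[neighbors] / np.linalg.norm(Z[neighbors], axis=1, keepdims=True)
    s = (N @ z) / tau
    s -= s.max()
    e = np.exp(s)
    return e / e.sum()

def relation_distillation_loss(Z_local, Z_global, adj_list):
    """KL(global || local) on neighbor distributions, averaged over nodes."""
    loss = 0.0
    for node, nbrs in adj_list.items():
        if not nbrs:
            continue
        p = similarity_distribution(Z_global, node, nbrs)   # teacher
        q = similarity_distribution(Z_local, node, nbrs)    # student
        loss += np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)))
    return loss / len(adj_list)

# toy usage with random local and global embeddings
rng = np.random.default_rng(1)
Zl, Zg = rng.normal(size=(5, 8)), rng.normal(size=(5, 8))
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1, 4], 4: [3]}
print(relation_distillation_loss(Zl, Zg, adj))
```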
arXiv Detail & Related papers (2024-06-27T07:08:28Z)
- GNN-LoFI: a Novel Graph Neural Network through Localized Feature-based Histogram Intersection [51.608147732998994]
Graph neural networks are increasingly becoming the framework of choice for graph-based machine learning.
We propose a new graph neural network architecture that substitutes classical message passing with an analysis of the local distribution of node features.
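A minimal sketch of the core primitive, assuming scalar node features, fixed bin edges, and the classic histogram-intersection kernel as the similarity measure; the actual GNN-LoFI layer learns its reference histograms end-to-end:
```python
import numpy as np

def local_histogram(x, nbrs, bins):
    """Normalized histogram of a node's neighborhood feature values."""
    h, _ = np.histogram(x[nbrs], bins=bins)
    return h / max(h.sum(), 1)

def histogram_intersection(h1, h2):
    """Kernel value in [0, 1]: sum of elementwise minima."""
    return np.minimum(h1, h2).sum()

# toy usage: score node 0's neighborhood against two reference histograms
rng = np.random.default_rng(2)
x = rng.normal(size=10)                  # one scalar feature per node
bins = np.linspace(-3, 3, 9)             # 8 fixed bins
h = local_histogram(x, [1, 2, 3, 4], bins)
refs = [local_histogram(x, [5, 6, 7], bins), local_histogram(x, [8, 9], bins)]
print([histogram_intersection(h, r) for r in refs])   # per-reference scores
```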
arXiv Detail & Related papers (2024-01-17T13:04:23Z)
- On Discprecncies between Perturbation Evaluations of Graph Neural Network Attributions [49.8110352174327]
We assess attribution methods from a perspective not previously explored in the graph domain: retraining.
The core idea is to retrain the network on important (or not important) relationships as identified by the attributions.
We run our analysis on four state-of-the-art GNN attribution methods and five synthetic and real-world graph classification datasets.
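A sketch of that retraining protocol under simplifying assumptions: edges are ranked by a hypothetical attribution score, a fraction is kept, and `train_and_eval` is a dummy stand-in for retraining a GNN from scratch so the snippet runs:
```python
import numpy as np

def mask_edges_by_attribution(edges, scores, keep_fraction, keep_important=True):
    """Return the edge subset retained for retraining."""
    order = np.argsort(scores)[::-1] if keep_important else np.argsort(scores)
    k = int(len(edges) * keep_fraction)
    return [edges[i] for i in order[:k]]

def train_and_eval(edges):
    # stand-in for retraining a GNN on the masked graph; returns fake accuracy
    return 0.5 + 0.05 * len(edges)

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]
scores = np.array([0.9, 0.1, 0.7, 0.2, 0.4])    # hypothetical attributions
for frac in (0.2, 0.6, 1.0):
    kept = mask_edges_by_attribution(edges, scores, frac)
    print(frac, train_and_eval(kept))
```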
arXiv Detail & Related papers (2024-01-01T02:03:35Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
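A small sketch of the low-rank idea, assuming the learned propagation matrix is approximated by a rank-r truncated SVD so that each propagation step costs O(nrd) instead of O(n^2 d); the random matrix below stands in for the output of the bi-level structure-learning step:
```python
import numpy as np

rng = np.random.default_rng(3)
n, d, r = 200, 16, 10
S = rng.normal(size=(n, n)); S = (S + S.T) / 2   # stand-in propagation matrix
X = rng.normal(size=(n, d))

U, s, Vt = np.linalg.svd(S)
Ur, sr, Vr = U[:, :r], s[:r], Vt[:r]             # rank-r factors

full = S @ X                                     # O(n^2 d) per step
lowrank = Ur @ (sr[:, None] * (Vr @ X))          # O(n r d) per step
print(np.linalg.norm(full - lowrank) / np.linalg.norm(full))
```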
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Supervised Contrastive Learning with Structure Inference for Graph Classification [5.276232626689567]
We propose a graph neural network based on supervised contrastive learning and structure inference for graph classification.
With the integration of label information, one-vs-many contrastive learning can be extended to a many-vs-many setting.
Experiment results show the effectiveness of the proposed method compared with recent state-of-the-art methods.
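A hedged sketch of a many-vs-many supervised contrastive loss over graph-level embeddings, in the spirit of the summary: with labels, every same-class pair becomes a positive pair, not just augmented views of one graph. The temperature and toy inputs are assumptions rather than the paper's exact objective.
```python
import numpy as np

def supcon_loss(Z, y, tau=0.5):
    """Supervised contrastive loss over L2-normalized embeddings Z."""
    Z = Z / np.linalg.norm(Z, axis=1, keepdims=True)
    sim = Z @ Z.T / tau
    np.fill_diagonal(sim, -np.inf)               # exclude self-pairs
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    loss, count = 0.0, 0
    for i in range(len(y)):
        pos = [j for j in range(len(y)) if j != i and y[j] == y[i]]
        if pos:
            loss -= log_prob[i, pos].mean()      # pull all same-class graphs
            count += 1
    return loss / max(count, 1)

rng = np.random.default_rng(4)
Z = rng.normal(size=(6, 8))                      # graph-level embeddings
y = np.array([0, 0, 1, 1, 2, 2])
print(supcon_loss(Z, y))
```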
arXiv Detail & Related papers (2022-03-15T07:18:46Z)
- Neighborhood Random Walk Graph Sampling for Regularized Bayesian Graph Convolutional Neural Networks [0.6236890292833384]
In this paper, we propose a novel algorithm called Bayesian Graph Convolutional Network using Neighborhood Random Walk Sampling (BGCN-NRWS).
BGCN-NRWS uses a Markov chain Monte Carlo (MCMC)-based graph sampling algorithm that exploits graph structure, reduces overfitting through a variational inference layer, and yields consistently competitive classification results compared to the state of the art in semi-supervised node classification.
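A minimal sketch of neighborhood random walk sampling, assuming a restart random walk as the Markov chain; the MCMC acceptance machinery and the variational inference layer are omitted, and the restart probability and walk length are illustrative:
```python
import numpy as np

def random_walk_sample(adj, seed, n_steps=50, restart_p=0.15, rng=None):
    """Return the set of nodes visited by a restart random walk from `seed`."""
    rng = rng or np.random.default_rng()
    visited, cur = {seed}, seed
    for _ in range(n_steps):
        if rng.random() < restart_p or not adj[cur]:
            cur = seed                           # restart at the seed node
        else:
            cur = rng.choice(adj[cur])           # move to a random neighbor
        visited.add(int(cur))
    return visited

adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1, 4], 4: [3]}
print(random_walk_sample(adj, seed=0, rng=np.random.default_rng(5)))
```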
arXiv Detail & Related papers (2021-12-14T20:58:27Z)
- Inference Graphs for CNN Interpretation [12.765543440576144]
Convolutional neural networks (CNNs) have achieved superior accuracy in many vision-related tasks.
We propose to model the activity of a network's hidden layers using probabilistic models.
We show that such graphs are useful for understanding the general inference process of a class, as well as explaining decisions the network makes regarding specific images.
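A hedged sketch of modeling hidden-layer activity probabilistically, assuming a diagonal-covariance Gaussian per class over a layer's activations; the random activations are placeholders, and the paper assembles such per-layer models into full inference graphs:
```python
import numpy as np

def fit_class_gaussians(acts, labels):
    """Per-class mean/variance of hidden activations (diagonal covariance)."""
    return {c: (acts[labels == c].mean(0), acts[labels == c].var(0) + 1e-6)
            for c in np.unique(labels)}

def class_log_likelihood(a, mu, var):
    """Log-density of one activation vector under a class Gaussian."""
    return -0.5 * np.sum((a - mu) ** 2 / var + np.log(2 * np.pi * var))

rng = np.random.default_rng(6)
acts = rng.normal(size=(100, 32))                # stand-in hidden activations
labels = rng.integers(0, 3, size=100)
models = fit_class_gaussians(acts, labels)
a = acts[0]
print({c: round(class_log_likelihood(a, *mv), 1) for c, mv in models.items()})
```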
arXiv Detail & Related papers (2021-10-20T13:56:09Z)
- Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets; the results show that our model effectively improves performance on semi-supervised node classification.
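A small sketch of the factorization described above, assuming the unary potentials come from a frozen backbone (a random stand-in here) and one shared pairwise factor matrix models output-output relations; the energy of a full label assignment is the sum of both terms:
```python
import numpy as np

def mrf_energy(y, unary_logits, pairwise, edges):
    """Negative log-potential of a label configuration y."""
    e = -sum(unary_logits[i, y[i]] for i in range(len(y)))   # input-output term
    e -= sum(pairwise[y[u], y[v]] for u, v in edges)         # output-output term
    return e

rng = np.random.default_rng(7)
unary = rng.normal(size=(4, 2))                  # stand-in backbone outputs
pairwise = np.array([[1.0, -1.0], [-1.0, 1.0]])  # homophilous pairwise factor
edges = [(0, 1), (1, 2), (2, 3)]
for y in [(0, 0, 0, 0), (0, 1, 0, 1)]:
    print(y, mrf_energy(y, unary, pairwise, edges))
```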
arXiv Detail & Related papers (2021-07-27T19:47:53Z)
- A Unifying Generative Model for Graph Learning Algorithms: Label Propagation, Graph Convolutions, and Combinations [39.8498896531672]
Semi-supervised learning on graphs is a widely applicable problem in network science and machine learning.
We develop a Markov random field model for the data generation process of node attributes.
We show that label propagation, a linearized graph convolutional network, and their combination can all be derived as conditional expectations.
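A minimal sketch of one of those special cases, label propagation, using the standard symmetric normalization and label clamping; the paper derives this procedure as a conditional expectation under its Markov random field model:
```python
import numpy as np

def label_propagation(A, Y, observed, n_iters=50):
    """A: adjacency matrix; Y: (n, c) one-hot labels; observed: bool mask."""
    d = A.sum(1)
    S = A / np.sqrt(np.outer(d, d))              # symmetric normalization
    F = Y.astype(float).copy()
    for _ in range(n_iters):
        F = S @ F                                # average neighbor estimates
        F[observed] = Y[observed]                # clamp known training labels
    return F / F.sum(1, keepdims=True)

A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
Y = np.array([[1, 0], [0, 0], [0, 0], [0, 1]], float)
obs = np.array([True, False, False, True])
print(label_propagation(A, Y, obs).round(2))
```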
arXiv Detail & Related papers (2021-01-19T17:07:08Z)
- Interpreting Graph Neural Networks for NLP With Differentiable Edge Masking [63.49779304362376]
Graph neural networks (GNNs) have become a popular approach to integrating structural inductive biases into NLP models.
We introduce a post-hoc method for interpreting the predictions of GNNs that identifies unnecessary edges.
We show that we can drop a large proportion of edges without deteriorating the performance of the model.
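A hedged sketch of the edge-dropping idea, substituting a greedy search over a frozen one-layer linear stand-in model for the paper's learned differentiable mask; edges whose removal barely changes predictions are dropped, the rest are kept as important:
```python
import numpy as np

def predict(A, X, W):
    return (A @ X) @ W                           # frozen one-layer stand-in GNN

def greedy_edge_drop(A, X, W, tol=1e-2):
    """Drop edges one by one while predictions stay within `tol`."""
    base = predict(A, X, W)
    A = A.copy()
    kept = []
    for u, v in zip(*np.nonzero(np.triu(A))):
        A[u, v] = A[v, u] = 0.0                  # try removing the edge
        if np.abs(predict(A, X, W) - base).max() > tol:
            A[u, v] = A[v, u] = 1.0              # too important, restore it
            kept.append((int(u), int(v)))
    return A, kept

rng = np.random.default_rng(8)
A = np.array([[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 1], [0, 0, 1, 0]], float)
X, W = rng.normal(size=(4, 5)), rng.normal(size=(5, 2))
A_pruned, important = greedy_edge_drop(A, X, W, tol=0.5)
print("edges kept:", important)
```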
arXiv Detail & Related papers (2020-10-01T17:51:19Z)
- Residual Correlation in Graph Neural Network Regression [39.54530450932135]
We show that the conditional independence assumption severely limits predictive power.
We address this problem with an interpretable and efficient framework.
Our framework achieves substantially higher accuracy than competing baselines.
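A minimal sketch of exploiting residual correlation: assume regression residuals are jointly Gaussian with a graph-structured precision matrix I - alpha*S, then correct test predictions by Gaussian conditioning on the observed training residuals. The choice of alpha and the toy inputs are illustrative assumptions.
```python
import numpy as np

def correlated_residual_correction(S, resid_train, train, test, alpha=0.5):
    """E[r_test | r_train] under a zero-mean Gaussian with precision I - alpha*S."""
    n = S.shape[0]
    P = np.eye(n) - alpha * S                    # graph-structured precision
    # Gaussian conditioning: E[r_test | r_train] = -P_tt^{-1} P_tr r_train
    P_tt = P[np.ix_(test, test)]
    P_tr = P[np.ix_(test, train)]
    return -np.linalg.solve(P_tt, P_tr @ resid_train)

A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
d = A.sum(1)
S = A / np.sqrt(np.outer(d, d))                  # normalized adjacency
train, test = [0, 3], [1, 2]
resid_train = np.array([0.8, -0.5])              # observed training residuals
print(correlated_residual_correction(S, resid_train, train, test))
```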
arXiv Detail & Related papers (2020-02-19T16:32:54Z)