Revisiting graph neural networks and distance encoding from a practical
view
- URL: http://arxiv.org/abs/2011.12228v3
- Date: Sun, 29 Nov 2020 19:31:29 GMT
- Title: Revisiting graph neural networks and distance encoding from a practical
view
- Authors: Haoteng Yin, Yanbang Wang, Pan Li
- Abstract summary: Graph neural networks (GNNs) are widely used in applications based on graph-structured data, such as node classification and link prediction.
A recently proposed technique, distance encoding (DE), makes GNNs work well in many such applications.
- Score: 10.193375978547019
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) are widely used in applications based
on graph-structured data, such as node classification and link prediction.
However, GNNs are often used as a black-box tool and are rarely investigated
in depth regarding whether they fit applications whose properties may vary. A
recently proposed technique, distance encoding (DE) (Li et al. 2020), makes
GNNs work well in many applications, including node classification and link
prediction. The theory provided in (Li et al. 2020) supports DE by proving
that it improves the representation power of GNNs. However, it is not obvious
how that theory translates into practical benefit. Here, we revisit GNNs and
DE from a more practical point of view, aiming to explain how DE makes GNNs
fit for node classification and link prediction.
Specifically, for link prediction, DE can be viewed as a way to establish
correlations between a pair of node representations. For node classification,
the problem is more complicated, as different classification tasks may use
node labels with different physical meanings. We focus on the most widely
considered node classification scenarios, categorize node labels into two
types, community type and structure type, and then analyze the different
mechanisms that GNNs adopt to predict these two types of labels. We also run
extensive experiments comparing eight configurations of GNNs paired with DE
for predicting node labels over eight real-world graphs. The results
demonstrate the uniform effectiveness of DE for predicting structure-type
labels. Lastly, we draw three conclusions on how to use GNNs and DE properly
in node classification tasks.
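To make the link-prediction point concrete, here is a minimal sketch of the
simplest form of DE: annotating every node with its truncated shortest-path
distances to the two endpoints of a candidate link before a GNN consumes the
features. This is our illustration under stated assumptions (the
`de_features` helper, the cutoff of 5, and the karate-club example are
invented here), not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of distance encoding for link
# prediction: annotate each node with its shortest-path distances to the
# endpoints (u, v) of the candidate link, truncated at max_dist.
import networkx as nx
import numpy as np

def de_features(G: nx.Graph, u, v, max_dist: int = 5) -> np.ndarray:
    """Return an |V| x 2 matrix of truncated shortest-path distances
    from every node to the target pair (u, v)."""
    feats = np.full((G.number_of_nodes(), 2), float(max_dist))
    index = {n: i for i, n in enumerate(G.nodes())}
    for col, target in enumerate((u, v)):
        lengths = nx.single_source_shortest_path_length(G, target, cutoff=max_dist)
        for node, d in lengths.items():
            feats[index[node], col] = d
    return feats

# Usage: append the DE columns to the raw node features X, so that message
# passing sees where each node sits relative to the link (u, v).
G = nx.karate_club_graph()
X = np.eye(G.number_of_nodes())               # placeholder raw features
X_de = np.hstack([X, de_features(G, 0, 33)])  # shape (34, 36)
```

The two extra columns are exactly what lets a GNN correlate the endpoint
pair: each node representation now encodes its position relative to both
endpoints, rather than being computed independently of the link.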
Related papers
- GNN-MultiFix: Addressing the pitfalls for GNNs for multi-label node classification [1.857645719601748]
Graph neural networks (GNNs) have emerged as powerful models for learning representations of graph data.
We show that even the most expressive GNN may fail to learn in the absence of node attributes and without explicit label information as input.
We propose a straightforward approach, referred to as GNN-MultiFix, that integrates the feature, label, and positional information of a node.
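As a rough illustration of those three ingredients, the hedged sketch below
assembles a node input from raw features, training-label indicators, and a
simple positional signal; the random-walk return probabilities and every name
here are our assumptions, not the paper's exact construction.

```python
# Rough sketch (our assumption, not GNN-MultiFix itself): build a node input
# that concatenates raw features, one-hot labels of training nodes (zeros for
# the rest), and random-walk return probabilities as positional information.
import numpy as np

def multifix_style_input(X, A, y_train_onehot, walk_steps=4):
    """X: (n, f) features; A: (n, n) adjacency; y_train_onehot: (n, c)
    one-hot labels for training nodes, all-zero rows elsewhere."""
    P = A / A.sum(1, keepdims=True).clip(min=1)  # row-stochastic transitions
    pos, Pk = [], np.eye(len(A))
    for _ in range(walk_steps):                  # diag of P^k = return prob.
        Pk = Pk @ P
        pos.append(np.diag(Pk))
    return np.hstack([X, y_train_onehot, np.stack(pos, axis=1)])
```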
arXiv Detail & Related papers (2024-11-21T12:59:39Z)
- GNNEvaluator: Evaluating GNN Performance On Unseen Graphs Without Labels [81.93520935479984]
We study a new problem, GNN model evaluation, that aims to assess the performance of a specific GNN model trained on labeled and observed graphs.
We propose a two-stage GNN model evaluation framework, including (1) DiscGraph set construction and (2) GNNEvaluator training and inference.
With training supervision from the DiscGraph set, GNNEvaluator learns to precisely estimate the node classification accuracy of the GNN model under evaluation.
arXiv Detail & Related papers (2023-10-23T05:51:59Z)
- Hierarchical Model Selection for Graph Neural Networks [0.0]
We propose a hierarchical model selection framework (HMSF) that selects an appropriate graph neural network (GNN) model by analyzing indicators of each graph dataset.
In the experiment, we show that the model selected by our HMSF achieves high performance on node classification for various types of graph data.
arXiv Detail & Related papers (2022-12-01T22:31:21Z)
- Every Node Counts: Improving the Training of Graph Neural Networks on Node Classification [9.539495585692007]
We propose novel objective terms for the training of GNNs for node classification.
Our first term seeks to maximize the mutual information between node and label features.
Our second term promotes anisotropic smoothness in the prediction maps.
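As a toy illustration of the second term only (our reading, not the authors'
objective), the penalty below smooths predictions along edges but weights
each edge by the feature similarity of its endpoints, one simple way to make
smoothing anisotropic; the names and the Gaussian weight are assumptions.

```python
# Toy sketch (our reading, not the paper's exact term): an edge-wise
# smoothness penalty on predicted class distributions, made anisotropic by
# weighting each edge with the feature similarity of its endpoints.
import numpy as np

def anisotropic_smoothness(probs, X, edges):
    """probs: (n, c) predicted class distributions; X: (n, f) features;
    edges: iterable of (i, j) index pairs. Returns a scalar penalty."""
    penalty = 0.0
    for i, j in edges:
        w = np.exp(-np.linalg.norm(X[i] - X[j]) ** 2)   # weight in (0, 1]
        penalty += w * np.linalg.norm(probs[i] - probs[j]) ** 2
    return penalty  # added to the usual cross-entropy with a small coefficient
```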
arXiv Detail & Related papers (2022-11-29T23:25:14Z)
- A Variational Edge Partition Model for Supervised Graph Representation Learning [51.30365677476971]
This paper introduces a graph generative process to model how the observed edges are generated by aggregating the node interactions over a set of overlapping node communities.
We partition each edge into the summation of multiple community-specific weighted edges and use them to define community-specific GNNs.
A variational inference framework is proposed to jointly learn a GNN-based inference network that partitions the edges into communities, the community-specific GNNs themselves, and a GNN-based predictor that combines them for the end classification task.
arXiv Detail & Related papers (2022-02-07T14:37:50Z)
- Towards Self-Explainable Graph Neural Network [24.18369781999988]
Graph Neural Networks (GNNs) generalize deep neural networks to graph-structured data.
However, GNNs lack explainability, which limits their adoption in scenarios that demand model transparency.
We propose a new framework which can find $K$-nearest labeled nodes for each unlabeled node to give explainable node classification.
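A minimal sketch of that idea under our assumptions (Euclidean metric,
majority vote; all names are invented): the K nearest labeled nodes both
produce the prediction and serve as its explanation.

```python
# Minimal sketch of explanation-by-neighbors: classify an unlabeled node by
# the majority label of its K nearest labeled nodes in embedding space, and
# return those neighbors as the explanation.
import numpy as np
from collections import Counter

def knn_explain(emb, labeled_idx, labels, query_idx, k=3):
    """emb: (n, d) node embeddings; labeled_idx: list of labeled indices;
    labels: dict index -> class; query_idx: the node to classify."""
    dists = np.linalg.norm(emb[labeled_idx] - emb[query_idx], axis=1)
    nearest = [labeled_idx[i] for i in np.argsort(dists)[:k]]
    pred = Counter(labels[i] for i in nearest).most_common(1)[0][0]
    return pred, nearest  # the K labeled nodes double as the explanation
```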
arXiv Detail & Related papers (2021-08-26T22:45:11Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order Weisfeiler-Lehman tests, are inefficient as they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structural features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- Towards Deeper Graph Neural Networks with Differentiable Group Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues that limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, differentiable group normalization (DGN), to address this issue.
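A simplified sketch of that technique as we read it: nodes are softly
assigned to groups and each group is normalized independently; the softmax
assignment and the residual coefficient `lam` are our assumptions, not the
reference implementation.

```python
# Simplified sketch (not the reference implementation) of differentiable
# group normalization: softly assign nodes to groups, normalize each group's
# weighted embeddings independently, and add the results back residually.
import numpy as np

def dgn_layer(H, U, eps=1e-5, lam=0.01):
    """H: (n, d) node embeddings; U: (d, g) learnable assignment weights."""
    S = np.exp(H @ U)
    S /= S.sum(1, keepdims=True)              # soft group assignment (n, g)
    out = H.copy()
    for g in range(S.shape[1]):
        Hg = S[:, g:g + 1] * H                # group-weighted embeddings
        mu, var = Hg.mean(0), Hg.var(0)
        out = out + lam * (Hg - mu) / np.sqrt(var + eps)
    return out
```

Normalizing per group rather than globally is what counteracts
over-smoothing: embeddings are pulled toward their group's statistics instead
of collapsing toward a single graph-wide average.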
arXiv Detail & Related papers (2020-06-12T07:18:02Z)
- Unifying Graph Convolutional Neural Networks and Label Propagation [73.82013612939507]
We study the relationship between label propagation (LPA) and graph convolutional networks (GCN) in terms of two aspects: feature/label smoothing and feature/label influence.
Based on our theoretical analysis, we propose an end-to-end model that unifies GCN and LPA for node classification.
Our model can also be seen as learning attention weights based on node labels, which is more task-oriented than existing feature-based attention models.
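For context, the LPA half of that unification is standard label propagation;
here is a minimal sketch of the standard algorithm (the paper's actual
contribution, coupling LPA with a GCN, is not reproduced here).

```python
# Minimal sketch of standard label propagation (LPA): repeatedly average
# neighbor label distributions while clamping the observed labels.
import numpy as np

def label_propagation(A, Y0, labeled_mask, iters=20):
    """A: (n, n) adjacency; Y0: (n, c) one-hot labels (zero rows where
    unknown); labeled_mask: (n,) bool array of observed nodes."""
    P = A / A.sum(1, keepdims=True).clip(min=1)  # row-normalized adjacency
    Y = Y0.astype(float)
    for _ in range(iters):
        Y = P @ Y
        Y[labeled_mask] = Y0[labeled_mask]       # clamp observed labels
    return Y.argmax(1)                           # predicted classes
```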
arXiv Detail & Related papers (2020-02-17T03:23:13Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
The Graph Neural Network (GNN) is a powerful model for learning representations and making predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
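The pairwise-interaction aggregator also has a cheap closed form: the sum of
elementwise products over all neighbor pairs equals half of the squared sum
minus the sum of squares. A hedged sketch of that aggregator follows;
normalizing by the number of pairs is our assumption.

```python
# Sketch of a bilinear neighbor aggregator in the spirit of BGNN: sum of
# elementwise products over all neighbor pairs, computed in linear time via
# the identity  sum_{i<j} a_i * a_j = 0.5 * ((sum a)^2 - sum a^2).
import numpy as np

def bilinear_aggregate(H, neighbors, W):
    """H: (n, d) embeddings; neighbors: list of neighbor index lists;
    W: (d, k) weight matrix. Returns (n, k) pairwise-interaction messages."""
    HW = H @ W
    out = np.zeros((H.shape[0], W.shape[1]))
    for v, nbrs in enumerate(neighbors):
        s = HW[nbrs].sum(0)                        # sum of transformed nbrs
        sq = (HW[nbrs] ** 2).sum(0)                # sum of their squares
        pairs = max(len(nbrs) * (len(nbrs) - 1) // 2, 1)
        out[v] = 0.5 * (s ** 2 - sq) / pairs
    return out  # combined with the usual weighted-sum message in BGNN
```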
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.