Customizing Graph Neural Networks using Path Reweighting
- URL: http://arxiv.org/abs/2106.10866v3
- Date: Tue, 12 Mar 2024 08:14:31 GMT
- Title: Customizing Graph Neural Networks using Path Reweighting
- Authors: Jianpeng Chen and Yujing Wang and Ming Zeng and Zongyi Xiang and Bitan Hou and Yunhai Tong and Ole J. Mengshoel and Yazhou Ren
- Abstract summary: We propose a novel GNN solution, namely Customized Graph Neural Network with Path Reweighting (CustomGNN for short).
Specifically, the proposed CustomGNN can automatically learn the high-level semantics for specific downstream tasks to highlight semantically relevant paths as well as to filter out task-irrelevant noise in a graph.
In experiments with the node classification task, CustomGNN achieves state-of-the-art accuracies on three standard graph datasets and four large graph datasets.
- Score: 23.698877985105312
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have been extensively used for mining
graph-structured data with impressive performance. However, because these
traditional GNNs do not distinguish among various downstream tasks, the
embeddings they produce are not always effective. Intuitively, paths in a graph imply
different semantics for different downstream tasks. Inspired by this, we design
a novel GNN solution, namely Customized Graph Neural Network with Path
Reweighting (CustomGNN for short). Specifically, the proposed CustomGNN can
automatically learn the high-level semantics for specific downstream tasks to
highlight semantically relevant paths as well as to filter out task-irrelevant
noise in a graph. Furthermore, we empirically analyze the semantics learned by
CustomGNN and demonstrate its ability to avoid the three inherent problems in
traditional GNNs, i.e., over-smoothing, poor robustness, and overfitting. In
experiments with the node classification task, CustomGNN achieves
state-of-the-art accuracies on three standard graph datasets and four large
graph datasets. The source code of the proposed CustomGNN is available at
\url{https://github.com/cjpcool/CustomGNN}.
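The path-reweighting idea described in the abstract can be illustrated with a toy sketch: enumerate short paths from each node, score each path with a learned vector, and aggregate endpoint features weighted by normalized path scores. The function name, the sigmoid scoring rule, and the mean-pooling of path features below are all our assumptions for illustration, not CustomGNN's actual architecture.

```python
import numpy as np

def path_reweighted_aggregate(adj, feats, w, max_len=2):
    """Toy path-reweighting aggregation (not the paper's exact model):
    enumerate simple paths of up to max_len hops from each node, score
    each path by a sigmoid of a learned vector w applied to the mean of
    its node features, then aggregate endpoint features weighted by the
    normalized path scores."""
    n = adj.shape[0]
    out = np.zeros_like(feats)
    for src in range(n):
        # depth-limited enumeration of simple paths starting at src
        stack = [(src, [src])]
        scored = []
        while stack:
            node, path = stack.pop()
            if len(path) > 1:
                # path score: sigmoid of w . mean(features along path)
                score = 1.0 / (1.0 + np.exp(-w @ feats[path].mean(axis=0)))
                scored.append((score, path[-1]))
            if len(path) <= max_len:
                for nxt in np.nonzero(adj[node])[0]:
                    if nxt not in path:
                        stack.append((nxt, path + [nxt]))
        total = sum(s for s, _ in scored)
        if total > 0:
            for s, end in scored:
                # aggregate endpoint features by normalized path weight
                out[src] += (s / total) * feats[end]
    return out
```

With `w = 0` every path gets equal weight and this reduces to uniform multi-hop aggregation; a task-specific `w` would up-weight semantically relevant paths, which is the customization the paper argues for.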
Related papers
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$^2$GNNs), which combine spatially and spectrally parametrized graph filters.
We show that S$^2$GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z) - Search to Fine-tune Pre-trained Graph Neural Networks for Graph-level Tasks [22.446655655309854]
Graph neural networks (GNNs) have shown their unprecedented success in many graph-related tasks.
Recent efforts try to pre-train GNNs on a large-scale unlabeled graph and adapt the knowledge from the unlabeled graph to the target downstream task.
Despite the importance of fine-tuning, current GNN pre-training work often neglects the design of a good fine-tuning strategy.
arXiv Detail & Related papers (2023-08-14T06:32:02Z) - Geodesic Graph Neural Network for Efficient Graph Representation Learning [34.047527874184134]
We propose an efficient GNN framework called Geodesic GNN (GDGNN).
It injects conditional relationships between nodes into the model without labeling.
Conditioned on the geodesic representations, GDGNN is able to generate node, link, and graph representations that carry much richer structural information than plain GNNs.
arXiv Detail & Related papers (2022-10-06T02:02:35Z) - Neo-GNNs: Neighborhood Overlap-aware Graph Neural Networks for Link Prediction [23.545059901853815]
Graph Neural Networks (GNNs) have been widely applied to various fields for learning over graph-structured data.
We propose Neighborhood Overlap-aware Graph Neural Networks (Neo-GNNs) that learn useful structural features from overlapped neighborhoods in the adjacency matrix for link prediction.
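The structural signal that Neo-GNNs build on can be sketched in a minimal form (our illustration, not the paper's model): the (i, j) entry of the squared adjacency matrix counts the common neighbors of nodes i and j, a classic overlap feature for link prediction.

```python
import numpy as np

def neighborhood_overlap_scores(adj):
    """Common-neighbor overlap scores from an adjacency matrix.
    overlap[i, j] = number of nodes adjacent to both i and j."""
    overlap = adj @ adj           # (A @ A)[i, j] counts 2-hop walks i -> k -> j
    np.fill_diagonal(overlap, 0)  # a node's overlap with itself is not a link feature
    return overlap
```

Higher overlap between two unconnected nodes suggests a more likely missing link; learned models refine this raw count rather than replace it.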
arXiv Detail & Related papers (2022-06-09T01:43:49Z) - Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown a powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z) - KerGNNs: Interpretable Graph Neural Networks with Graph Kernels [14.421535610157093]
Graph neural networks (GNNs) have become the state-of-the-art method in downstream graph-related tasks.
We propose a novel GNN framework, termed Kernel Graph Neural Networks (KerGNNs).
KerGNNs integrate graph kernels into the message passing process of GNNs.
We show that our method achieves competitive performance compared with existing state-of-the-art methods.
arXiv Detail & Related papers (2022-01-03T06:16:30Z) - AutoGraph: Automated Graph Neural Network [45.94642721490744]
We propose a method to automate the deep Graph Neural Networks (GNNs) design.
In our proposed method, we add a new type of skip connection to the GNNs search space to encourage feature reuse.
We also allow our evolutionary algorithm to increase the layers of GNNs during the evolution to generate deeper networks.
arXiv Detail & Related papers (2020-11-23T09:04:17Z) - Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.
We show that PTDNet can improve the performance of GNNs significantly and the performance gain becomes larger for more noisy datasets.
arXiv Detail & Related papers (2020-11-13T18:53:21Z) - Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE), a new class of features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z) - GPT-GNN: Generative Pre-Training of Graph Neural Networks [93.35945182085948]
Graph neural networks (GNNs) have been demonstrated to be powerful in modeling graph-structured data.
We present the GPT-GNN framework to initialize GNNs by generative pre-training.
We show that GPT-GNN significantly outperforms state-of-the-art GNN models without pre-training by up to 9.1% across various downstream tasks.
arXiv Detail & Related papers (2020-06-27T20:12:33Z) - XGNN: Towards Model-Level Explanations of Graph Neural Networks [113.51160387804484]
Graph neural networks (GNNs) learn node features by aggregating and combining neighbor information.
GNNs are mostly treated as black boxes and lack human-intelligible explanations.
We propose a novel approach, known as XGNN, to interpret GNNs at the model-level.
arXiv Detail & Related papers (2020-06-03T23:52:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.