Bag of Tricks of Semi-Supervised Classification with Graph Neural
Networks
- URL: http://arxiv.org/abs/2103.13355v1
- Date: Wed, 24 Mar 2021 17:24:26 GMT
- Title: Bag of Tricks of Semi-Supervised Classification with Graph Neural
Networks
- Authors: Yangkun Wang
- Abstract summary: In this paper, we first summarize a collection of existing refinements, and then propose several novel techniques regarding these model designs and label usage.
We empirically evaluate their impacts on the final model accuracy through ablation studies, and show that we are able to significantly improve various GNN models to the extent that they outweigh the gains from model architecture improvement.
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Much of the recent progress made in node classification on graphs can be
credited to the careful design of graph neural networks (GNNs) and label
propagation algorithms. However, in the literature, in addition to improvements
to the model architecture, there are a number of improvements either briefly
mentioned as implementation details or visible only in source code, and these
overlooked techniques may play a pivotal role in their practical use. In this
paper, we first summarize a collection of existing refinements, and then
propose several novel techniques regarding these model designs and label usage.
We empirically evaluate their impacts on the final model accuracy through
ablation studies, and show that we are able to significantly improve various
GNN models to the extent that they outweigh the gains from model architecture
improvement. Notably, many of the top-ranked models on Open Graph Benchmark
benefit from our techniques.
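The abstract describes the label-usage tricks only at a high level. As an illustration of the general "label as input" idea common in this line of work, the sketch below appends one-hot training labels to the node features while randomly hiding a fraction of them, so the model cannot trivially copy its own input label. The function name, the masking scheme, and all parameters are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def augment_with_labels(features, labels, train_mask,
                        mask_rate=0.5, num_classes=None, rng=None):
    """Append one-hot labels of a random subset of training nodes to the
    node feature matrix; hidden and non-training nodes get all-zero label
    channels. Returns the augmented features and the visibility mask.
    (Illustrative sketch, not the paper's exact implementation.)"""
    rng = np.random.default_rng(rng)
    if num_classes is None:
        num_classes = int(labels.max()) + 1
    onehot = np.zeros((features.shape[0], num_classes), dtype=features.dtype)
    # Only a random subset of training nodes reveal their label as input.
    visible = train_mask & (rng.random(features.shape[0]) < (1.0 - mask_rate))
    onehot[visible, labels[visible]] = 1.0
    return np.concatenate([features, onehot], axis=1), visible
```

At training time, the loss would typically be computed only on the nodes whose labels were hidden; at inference, all known training labels can be revealed.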
Related papers
- Towards Graph Foundation Models: A Survey and Beyond [66.37994863159861]
Foundation models have emerged as critical components in a variety of artificial intelligence applications.
The capabilities of foundation models to generalize and adapt motivate graph machine learning researchers to discuss the potential of developing a new graph learning paradigm.
This article introduces the concept of Graph Foundation Models (GFMs), and offers an exhaustive explanation of their key characteristics and underlying technologies.
arXiv Detail & Related papers (2023-10-18T09:31:21Z)
- Self-supervision meets kernel graph neural models: From architecture to
augmentations [36.388069423383286]
We improve the design and learning of kernel graph neural networks (KGNNs).
We develop a novel structure-preserving graph data augmentation method called latent graph augmentation (LGA).
Our proposed model achieves competitive performance, comparable to or sometimes outperforming state-of-the-art graph representation learning frameworks.
arXiv Detail & Related papers (2023-10-17T14:04:22Z)
- Challenging the Myth of Graph Collaborative Filtering: a Reasoned and Reproducibility-driven Analysis [50.972595036856035]
We present a code that successfully replicates results from six popular and recent graph recommendation models.
We compare these graph models with traditional collaborative filtering models that historically performed well in offline evaluations.
By investigating the information flow from users' neighborhoods, we aim to identify which models are influenced by intrinsic features in the dataset structure.
arXiv Detail & Related papers (2023-08-01T09:31:44Z)
- Application of Graph Neural Networks and graph descriptors for graph
classification [0.0]
We focus on Graph Neural Networks (GNNs), which emerged as a de facto standard deep learning technique for graph representation learning.
We design a fair experimental evaluation protocol and choose a proper collection of datasets.
We arrive at many conclusions, which shed new light on the performance and quality of novel algorithms.
arXiv Detail & Related papers (2022-11-07T16:25:22Z)
- An Empirical Study of Retrieval-enhanced Graph Neural Networks [48.99347386689936]
Graph Neural Networks (GNNs) are effective tools for graph representation learning.
We propose a retrieval-enhanced scheme called GRAPHRETRIEVAL, which is agnostic to the choice of graph neural network models.
We conduct comprehensive experiments over 13 datasets, and we observe that GRAPHRETRIEVAL is able to reach substantial improvements over existing GNNs.
arXiv Detail & Related papers (2022-06-01T09:59:09Z)
- Improving Subgraph Representation Learning via Multi-View Augmentation [6.907772294522709]
Subgraph representation learning based on Graph Neural Network (GNN) has broad applications in chemistry and biology.
We develop a novel multiview augmentation mechanism to improve subgraph representation learning and thus the accuracy of downstream prediction tasks.
arXiv Detail & Related papers (2022-05-25T20:17:13Z)
- A Simple Yet Effective Pretraining Strategy for Graph Few-shot Learning [38.66690010054665]
We propose a simple transductive fine-tuning based framework as a new paradigm for graph few-shot learning.
For pretraining, we propose a supervised contrastive learning framework with data augmentation strategies specific for few-shot node classification.
arXiv Detail & Related papers (2022-03-29T22:30:00Z)
- Sparsifying the Update Step in Graph Neural Networks [15.446125349239534]
We study the effect of sparsification on the trainable part of MPNNs known as the Update step.
Specifically, we propose the ExpanderGNN model with a tuneable sparsification rate and the Activation-Only GNN, which has no linear transform in the Update step.
Our novel benchmark models enable a better understanding of the influence of the Update step on model performance.
arXiv Detail & Related papers (2021-09-02T13:06:34Z)
- Model-Agnostic Graph Regularization for Few-Shot Learning [60.64531995451357]
We present a comprehensive study on graph embedded few-shot learning.
We introduce a graph regularization approach that allows a deeper understanding of the impact of incorporating graph information between labels.
Our approach improves the performance of strong base learners by up to 2% on Mini-ImageNet and 6.7% on ImageNet-FS.
arXiv Detail & Related papers (2021-02-14T05:28:13Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Node Masking: Making Graph Neural Networks Generalize and Scale Better [71.51292866945471]
Graph Neural Networks (GNNs) have received a lot of interest in recent times.
In this paper, we utilize some theoretical tools to better visualize the operations performed by state-of-the-art spatial GNNs.
We introduce a simple concept, Node Masking, that allows them to generalize and scale better.
arXiv Detail & Related papers (2020-01-17T06:26:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.