AutoHEnsGNN: Winning Solution to AutoGraph Challenge for KDD Cup 2020
- URL: http://arxiv.org/abs/2111.12952v1
- Date: Thu, 25 Nov 2021 07:23:44 GMT
- Title: AutoHEnsGNN: Winning Solution to AutoGraph Challenge for KDD Cup 2020
- Authors: Jin Xu, Mingjian Chen, Jianqiang Huang, Xingyuan Tang, Ke Hu, Jian Li,
Jia Cheng, Jun Lei
- Abstract summary: We present AutoHEnsGNN, a framework to build effective and robust models for graph tasks without any human intervention.
AutoHEnsGNN won first place in the AutoGraph Challenge for KDD Cup 2020.
- Score: 29.511523832243046
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Networks (GNNs) have become increasingly popular and achieved
impressive results in many graph-based applications. However, extensive manual
work and domain knowledge are required to design effective architectures, and
the results of GNN models have high variance with different training setups,
which limits the application of existing GNN models. In this paper, we present
AutoHEnsGNN, a framework to build effective and robust models for graph tasks
without any human intervention. AutoHEnsGNN won first place in the AutoGraph
Challenge for KDD Cup 2020, and achieved the best rank score on five real-life
datasets in the final phase. Given a task, AutoHEnsGNN first applies a fast
proxy evaluation to automatically select a pool of promising GNN models. Then
it builds a hierarchical ensemble framework: 1) We propose graph self-ensemble
(GSE), which can reduce the variance of weight initialization and efficiently
exploit the information of local and global neighborhoods; 2) Based on GSE, a
weighted ensemble of different types of GNN models is used to effectively learn
more discriminative node representations. To efficiently search the
architectures and ensemble weights, we propose AutoHEnsGNN$_{\text{Gradient}}$,
which treats the architectures and ensemble weights as architecture parameters
and uses gradient-based architecture search to obtain optimal configurations,
and AutoHEnsGNN$_{\text{Adaptive}}$, which can adaptively adjust the ensemble
weight based on the model accuracy. Extensive experiments on node
classification, graph classification, edge prediction, and the KDD Cup challenge
demonstrate the effectiveness and generality of AutoHEnsGNN.
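The abstract describes the hierarchy but this summary ships no code; the sketch below is a rough PyTorch illustration of the AutoHEnsGNN$_{\text{Gradient}}$ idea, with ensemble weights treated as architecture parameters that gradient-based search can optimize. The class name `HierarchicalEnsemble` and the two-level softmax parameterization are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HierarchicalEnsemble(nn.Module):
    """Hypothetical sketch of the AutoHEnsGNN_Gradient idea: ensemble
    weights are architecture parameters learned by gradient descent."""

    def __init__(self, model_pool):
        # model_pool: {gnn_type_name: [snapshot_1, snapshot_2, ...]},
        # e.g. the same GNN trained from several weight initializations.
        super().__init__()
        self.pool = nn.ModuleDict(
            {name: nn.ModuleList(snaps) for name, snaps in model_pool.items()}
        )
        # Inner weights: graph self-ensemble (GSE) within one model type.
        self.inner = nn.ParameterDict(
            {name: nn.Parameter(torch.zeros(len(snaps)))
             for name, snaps in model_pool.items()}
        )
        # Outer weights: weighted ensemble across different GNN types.
        self.outer = nn.Parameter(torch.zeros(len(model_pool)))

    def forward(self, x, adj):
        per_type = []
        for name, snaps in self.pool.items():
            w = F.softmax(self.inner[name], dim=0)             # GSE weights
            logits = torch.stack([m(x, adj) for m in snaps])   # (S, N, C)
            per_type.append((w.view(-1, 1, 1) * logits).sum(0))
        w = F.softmax(self.outer, dim=0)                       # type weights
        return (w.view(-1, 1, 1) * torch.stack(per_type)).sum(0)
```

In a DARTS-style bi-level loop, the `inner`/`outer` parameters would be updated on validation loss while the snapshot weights train on the training loss; the Adaptive variant would instead set these weights directly from each model's validation accuracy.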
Related papers
- Diffusing to the Top: Boost Graph Neural Networks with Minimal Hyperparameter Tuning [33.948899558876604]
This work introduces a graph-conditioned latent diffusion framework (GNN-Diff) to generate high-performing GNNs.
We validate our method through 166 experiments across four graph tasks: node classification on small, large, and long-range graphs, as well as link prediction.
arXiv Detail & Related papers (2024-10-08T05:27:34Z)
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world applications of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
arXiv Detail & Related papers (2024-05-27T17:52:12Z)
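The SGGC summary above suggests a simple mental model: embed nodes spectrally, then pick ego-graph centers greedily so they spread over the embedding space. A minimal NumPy/SciPy sketch under that reading (farthest-point selection here is a stand-in for SGGC's actual coreset objective):

```python
import numpy as np
from scipy.sparse import csgraph

def spectral_greedy_centers(adj, k_eigs=16, n_centers=100):
    """Illustrative selection of ego-graph centers in a spectral embedding
    space; the real SGGC optimizes its own coreset objective."""
    # Spectral embedding: low eigenvectors of the normalized Laplacian
    # (dense eigendecomposition for clarity; use eigsh on large graphs).
    lap = csgraph.laplacian(adj, normed=True)
    lap = lap.toarray() if hasattr(lap, "toarray") else np.asarray(lap)
    _, vecs = np.linalg.eigh(lap)
    emb = vecs[:, :k_eigs]

    # Greedy farthest-point selection: each new center is the node
    # farthest in embedding space from the centers chosen so far.
    centers = [0]
    dist = np.linalg.norm(emb - emb[0], axis=1)
    for _ in range(n_centers - 1):
        nxt = int(dist.argmax())
        centers.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(emb - emb[nxt], axis=1))
    return centers  # train the GNN on the ego-graphs of these nodes
```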
- GNNEvaluator: Evaluating GNN Performance On Unseen Graphs Without Labels [81.93520935479984]
We study a new problem, GNN model evaluation, that aims to assess, on unseen graphs without labels, the performance of a specific GNN model trained on labeled and observed graphs.
We propose a two-stage GNN model evaluation framework, including (1) DiscGraph set construction and (2) GNNEvaluator training and inference.
Under the effective training supervision from the DiscGraph set, GNNEvaluator learns to precisely estimate node classification accuracy of the to-be-evaluated GNN model.
arXiv Detail & Related papers (2023-10-23T05:51:59Z)
- AutoGEL: An Automated Graph Neural Network with Explicit Link Information [7.525545233605658]
We present AutoGEL, a novel AutoGNN framework that explicitly models link information.
In this way, AutoGEL can handle the link prediction task and improves the performance of AutoGNNs on node classification and graph classification tasks.
arXiv Detail & Related papers (2021-12-02T09:09:18Z)
- Network In Graph Neural Network [9.951298152023691]
We present a model-agnostic methodology that allows arbitrary GNN models to increase their model capacity by making the model deeper.
Instead of adding or widening GNN layers, Network In Graph Neural Network (NGNN) deepens a GNN model by inserting non-linear feedforward neural network layer(s) within each GNN layer, as sketched below.
arXiv Detail & Related papers (2021-11-23T03:58:56Z)
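The NGNN recipe is concrete enough to sketch: keep the propagation step and insert a small feedforward network inside each layer. A minimal PyTorch illustration, assuming a plain dense-adjacency GCN-style layer (the paper's method is model-agnostic and not tied to this layer):

```python
import torch
import torch.nn as nn

class NGNNLayer(nn.Module):
    """Sketch of the NGNN idea: one message-passing step followed by a
    non-linear feedforward block, adding depth without extra hops."""

    def __init__(self, in_dim, out_dim, hidden_dim):
        super().__init__()
        self.gnn_lin = nn.Linear(in_dim, out_dim)  # GCN-style transform
        self.ffn = nn.Sequential(                  # the "network in" part
            nn.Linear(out_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x, adj_norm):
        h = torch.relu(adj_norm @ self.gnn_lin(x))  # one propagation step
        return self.ffn(h)                          # deepen within the layer
```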
- A Unified Lottery Ticket Hypothesis for Graph Neural Networks [82.31087406264437]
We present a unified GNN sparsification (UGS) framework that simultaneously prunes the graph adjacency matrix and the model weights.
We further generalize the popular lottery ticket hypothesis to GNNs for the first time, by defining a graph lottery ticket (GLT) as a pair of core sub-dataset and sparse sub-network.
arXiv Detail & Related papers (2021-02-12T21:52:43Z)
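A hedged sketch of the UGS idea: put trainable masks on both the adjacency matrix and the layer weights, train, then prune the lowest-magnitude mask entries and repeat. The dense adjacency and the simple magnitude criterion are illustrative simplifications, not UGS's exact parameterization:

```python
import torch
import torch.nn as nn

class MaskedGCNLayer(nn.Module):
    """Illustrative UGS-style layer: trainable masks on both the graph
    adjacency and the layer weights, pruned jointly."""

    def __init__(self, in_dim, out_dim, num_edges):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(in_dim, out_dim) * 0.1)
        self.w_mask = nn.Parameter(torch.ones(in_dim, out_dim))  # weight mask
        self.a_mask = nn.Parameter(torch.ones(num_edges))        # edge mask

    def forward(self, x, edge_index, n_nodes):
        adj = torch.zeros(n_nodes, n_nodes, device=x.device)
        adj[edge_index[0], edge_index[1]] = self.a_mask   # masked adjacency
        return adj @ (x @ (self.weight * self.w_mask))    # masked weights

def prune_lowest(mask, frac=0.05):
    """One magnitude-pruning step: zero out the lowest-|value| mask entries."""
    k = max(1, int(frac * mask.numel()))
    thresh = mask.detach().abs().flatten().kthvalue(k).values
    with torch.no_grad():
        mask[mask.abs() <= thresh] = 0.0
    return mask
```

A surviving (adjacency mask, weight mask) pair after iterative pruning is what the paper calls a graph lottery ticket.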
- Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.
We show that PTDNet can improve the performance of GNNs significantly and the performance gain becomes larger for more noisy datasets.
arXiv Detail & Related papers (2020-11-13T18:53:21Z)
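A rough sketch of PTDNet's mechanism as summarized above: a small parameterized network scores each edge, and a penalty on the total score discourages keeping task-irrelevant edges. The MLP scorer and L1-style penalty are assumptions standing in for the paper's exact formulation:

```python
import torch
import torch.nn as nn

class EdgeDenoiser(nn.Module):
    """Sketch of a PTDNet-style denoising layer: a learned network scores
    each edge; penalizing the total score encourages a sparse graph."""

    def __init__(self, feat_dim, hidden=32):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(2 * feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, edge_index):
        # Score each edge from its endpoint features.
        pair = torch.cat([x[edge_index[0]], x[edge_index[1]]], dim=-1)
        keep = torch.sigmoid(self.scorer(pair)).squeeze(-1)  # keep prob/edge
        penalty = keep.sum()   # proxy for the number of surviving edges
        return keep, penalty   # use `keep` to reweight messages in the GNN
```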
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs can generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structure-related features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
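One concrete instance of distance encoding that the summary supports: augment node features with one-hot shortest-path distances to a target node set. The helper below is illustrative; DE in the paper is a broader family of distance-based features:

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def distance_encoding(adj, target_nodes, max_dist=5):
    """Illustrative distance-encoding feature: one-hot shortest-path
    distance from every node to a target node set."""
    # Shortest-path distance from each target node to all nodes.
    d = shortest_path(adj, unweighted=True, indices=list(target_nodes))
    d = np.nan_to_num(d, posinf=max_dist)    # unreachable -> max_dist
    d = d.min(axis=0)                        # distance to nearest target
    d = np.clip(d, 0, max_dist).astype(int)
    return np.eye(max_dist + 1)[d]           # (n_nodes, max_dist + 1)
```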
- GPT-GNN: Generative Pre-Training of Graph Neural Networks [93.35945182085948]
Graph neural networks (GNNs) have been demonstrated to be powerful in modeling graph-structured data.
We present the GPT-GNN framework to initialize GNNs by generative pre-training.
We show that GPT-GNN significantly outperforms state-of-the-art GNN models without pre-training by up to 9.1% across various downstream tasks.
arXiv Detail & Related papers (2020-06-27T20:12:33Z)
- Self-Enhanced GNN: Improving Graph Neural Networks Using Model Outputs [20.197085398581397]
Graph neural networks (GNNs) have received much attention recently because of their excellent performance on graph-based tasks.
We propose self-enhanced GNN (SEG), which improves the quality of the input data using the outputs of existing GNN models.
SEG consistently improves the performance of well-known GNN models such as GCN, GAT and SGC across different datasets.
arXiv Detail & Related papers (2020-02-18T12:27:16Z)
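The SEG summary reads as output-driven data enhancement; a generic self-training-style sketch under that reading (SEG's actual enhancement procedure differs in detail):

```python
import torch

def self_enhance_labels(model, x, adj, labels, train_mask, thresh=0.95):
    """Sketch of output-based data enhancement: promote high-confidence
    predictions on unlabeled nodes to pseudo-labels, then retrain."""
    model.eval()
    with torch.no_grad():
        prob = torch.softmax(model(x, adj), dim=-1)
    conf, pred = prob.max(dim=-1)
    new_labels = labels.clone()
    new_mask = train_mask.clone()
    add = (~train_mask) & (conf > thresh)   # confident unlabeled nodes
    new_labels[add] = pred[add]
    new_mask |= add
    return new_labels, new_mask             # retrain on the enhanced set
```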
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.