Bridging the Gap of AutoGraph between Academia and Industry: Analysing
AutoGraph Challenge at KDD Cup 2020
- URL: http://arxiv.org/abs/2204.02625v1
- Date: Wed, 6 Apr 2022 07:06:48 GMT
- Title: Bridging the Gap of AutoGraph between Academia and Industry: Analysing
AutoGraph Challenge at KDD Cup 2020
- Authors: Zhen Xu, Lanning Wei, Huan Zhao, Rex Ying, Quanming Yao, Wei-Wei Tu,
Isabelle Guyon
- Abstract summary: Graph Neural Networks (GNNs) have been proven to be effective in modeling graph structured data and many variants of GNN architectures have been proposed.
Researchers naturally adopt Automated Machine Learning for graph learning, aiming to reduce human effort and achieve generally top-performing GNNs.
To understand GNN practitioners' automated solutions, we organized the AutoGraph Challenge at KDD Cup 2020, emphasizing automated graph neural networks for node classification.
- Score: 61.31176652211479
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph-structured data is ubiquitous in daily life and scientific domains and
has attracted increasing attention. Graph Neural Networks (GNNs) have been
proven effective at modeling graph-structured data, and many variants of
GNN architectures have been proposed. However, much human effort is often
needed to tune the architecture to a given dataset. Researchers have
naturally adopted Automated Machine Learning for graph learning, aiming to reduce
human effort and obtain generally top-performing GNNs, but their methods
focus mostly on architecture search. To understand GNN practitioners'
automated solutions, we organized the AutoGraph Challenge at KDD Cup 2020,
emphasizing automated graph neural networks for node classification. We
received top solutions, especially from industrial tech companies such as Meituan,
Alibaba, and Twitter, which are already open-sourced on GitHub. After detailed
comparisons with solutions from academia, we quantify the gaps between academia
and industry in modeling scope, effectiveness, and efficiency, and show that (1)
academic AutoML-for-graph solutions focus on GNN architecture search, while
industrial solutions, especially the winning ones in the KDD Cup, tend to
build an overall pipeline; (2) by neural architecture search alone, academic
solutions achieve on average 97.3% of the accuracy of industrial solutions; (3)
academic solutions are cheap to obtain, taking several GPU hours, while industrial
solutions take a few months of labor. Academic solutions also contain far fewer
parameters.
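For background on the challenge's task, automated node classification with GNNs, the sketch below shows a minimal two-layer GCN-style forward pass in plain NumPy on a hypothetical toy graph with random weights. It is only an illustration of the model family being automated, not any participant's solution:

```python
import numpy as np

def normalize_adj(A):
    # Symmetric normalization with self-loops: D^(-1/2) (A + I) D^(-1/2)
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_forward(A, X, W1, W2):
    # Two rounds of propagate-then-transform, ending in class probabilities.
    A_norm = normalize_adj(A)
    H = np.maximum(A_norm @ X @ W1, 0.0)   # ReLU
    logits = A_norm @ H @ W2
    # Row-wise softmax over class logits
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Toy path graph: 4 nodes, 3 input features, 2 classes (all hypothetical)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
W1 = rng.normal(size=(3, 8))
W2 = rng.normal(size=(8, 2))
probs = gcn_forward(A, X, W1, W2)   # (4, 2) class probabilities per node
```

In this framing, AutoML approaches search over choices such as the number of layers, hidden sizes, aggregation functions, and training hyperparameters.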
Related papers
- Combinatorial Optimization with Automated Graph Neural Networks [28.19349828026972]
We present a new class of AUTOmated GNNs for solving NP-hard CO problems, namely AutoGNP.
The idea of AutoGNP is to use graph neural architecture search algorithms to automatically find the best GNNs for a given NP-hard optimization problem.
arXiv Detail & Related papers (2024-06-05T02:43:41Z)
- Towards a General GNN Framework for Combinatorial Optimization [14.257210124854863]
We introduce a novel GNN architecture which leverages a complex filter bank and localized attention mechanisms designed to solve CO problems on graphs.
We show how our method differentiates itself from prior GNN-based CO solvers and how it can be effectively applied to the maximum clique, minimum dominating set, and maximum cut problems.
arXiv Detail & Related papers (2024-05-31T00:02:07Z)
- The Evolution of Distributed Systems for Graph Neural Networks and their Origin in Graph Processing and Deep Learning: A Survey [17.746899445454048]
Graph Neural Networks (GNNs) are an emerging research field.
GNNs can be applied to various domains including recommendation systems, computer vision, natural language processing, biology and chemistry.
We aim to fill this gap by summarizing and categorizing important methods and techniques for large-scale GNN solutions.
arXiv Detail & Related papers (2023-05-23T09:22:33Z)
- Distributed Graph Neural Network Training: A Survey [51.77035975191926]
Graph neural networks (GNNs) are deep learning models trained on graphs and have been successfully applied in various domains.
Despite their effectiveness, it remains challenging to scale GNNs efficiently to large graphs.
As a remedy, distributed computing has become a promising approach to training large-scale GNNs.
arXiv Detail & Related papers (2022-11-01T01:57:00Z)
- A Comprehensive Study on Large-Scale Graph Training: Benchmarking and Rethinking [124.21408098724551]
Large-scale graph training is a notoriously challenging problem for graph neural networks (GNNs).
We present a new ensemble training method, named EnGCN, to address the existing issues.
Our proposed method achieves new state-of-the-art (SOTA) performance on large-scale datasets.
arXiv Detail & Related papers (2022-10-14T03:43:05Z)
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
These operations can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- PaSca: a Graph Neural Architecture Search System under the Scalable Paradigm [24.294196319217907]
Graph neural networks (GNNs) have achieved state-of-the-art performance in various graph-based tasks.
However, GNNs scale poorly with data size and the number of message-passing steps.
This paper proposes PaSca, a new paradigm and system that offers a principled approach to systematically construct and explore the design space of scalable GNNs.
arXiv Detail & Related papers (2022-03-01T17:26:50Z)
- AutoHEnsGNN: Winning Solution to AutoGraph Challenge for KDD Cup 2020 [29.511523832243046]
We present AutoHEnsGNN, a framework to build effective and robust models for graph tasks without any human intervention.
AutoHEnsGNN won first place in the AutoGraph Challenge for KDD Cup 2020.
arXiv Detail & Related papers (2021-11-25T07:23:44Z)
- Edge-featured Graph Neural Architecture Search [131.4361207769865]
We propose Edge-featured Graph Neural Architecture Search to find the optimal GNN architecture.
Specifically, we design rich entity and edge updating operations to learn high-order representations.
We show that EGNAS can search for better GNNs with higher performance than current state-of-the-art human-designed and search-based GNNs.
arXiv Detail & Related papers (2021-09-03T07:53:18Z)
- Scaling Graph Neural Networks with Approximate PageRank [64.92311737049054]
We present the PPRGo model which utilizes an efficient approximation of information diffusion in GNNs.
In addition to being faster, PPRGo is inherently scalable, and can be trivially parallelized for large datasets like those found in industry settings.
We show that training PPRGo and predicting labels for all nodes of a large graph takes under 2 minutes on a single machine, far outpacing other baselines on the same graph.
arXiv Detail & Related papers (2020-07-03T09:30:07Z)
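PPRGo's efficient diffusion rests on approximate personalized PageRank. One standard way to compute it, shown below as a sketch and not the authors' code, is the push-style local approximation (in the spirit of Andersen, Chung, and Lang) over an adjacency list; the toy star graph is hypothetical:

```python
from collections import defaultdict

def approx_ppr(neighbors, source, alpha=0.15, eps=1e-4):
    """Push-based approximate personalized PageRank from `source`.

    neighbors: dict mapping node -> list of neighbor nodes (undirected graph).
    alpha: teleport (restart) probability; eps: per-degree residual threshold.
    Returns a sparse dict of PPR scores (only nonzero entries).
    """
    p = defaultdict(float)   # approximate PPR vector
    r = defaultdict(float)   # residual probability mass not yet settled
    r[source] = 1.0
    queue = [source]
    while queue:
        u = queue.pop()
        deg = len(neighbors[u])
        if deg == 0 or r[u] < eps * deg:
            continue  # residual too small; nothing to push
        mass = r[u]
        p[u] += alpha * mass          # settle an alpha fraction at u
        r[u] = 0.0
        share = (1 - alpha) * mass / deg
        for v in neighbors[u]:        # spread the rest to neighbors
            before = r[v]
            r[v] += share
            # Re-activate v once its residual crosses the threshold
            if before < eps * len(neighbors[v]) <= r[v]:
                queue.append(v)
    return dict(p)

# Toy star graph: node 0 connected to leaves 1..4
nbrs = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
scores = approx_ppr(nbrs, source=0)  # source node gets the largest score
```

Because each push settles mass locally, the cost depends on the neighborhood of the source rather than the whole graph, which is what makes such approximations attractive for the industry-scale graphs PPRGo targets.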
This list is automatically generated from the titles and abstracts of the papers in this site.