Evolutionary Architecture Search for Graph Neural Networks
- URL: http://arxiv.org/abs/2009.10199v1
- Date: Mon, 21 Sep 2020 22:11:53 GMT
- Title: Evolutionary Architecture Search for Graph Neural Networks
- Authors: Min Shi, David A. Wilson, Xingquan Zhu, Yu Huang, Yuan Zhuang, Jianxun Liu and Yufei Tang
- Abstract summary: We propose a novel AutoML framework through the evolution of individual models in a large Graph Neural Networks (GNN) architecture space.
To the best of our knowledge, this is the first work to introduce and evaluate evolutionary architecture search for GNN models.
- Score: 23.691915813153496
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Automated machine learning (AutoML) has seen a resurgence in interest with
the boom of deep learning over the past decade. In particular, Neural
Architecture Search (NAS) has received significant attention throughout the AutoML
research community, and has pushed forward the state-of-the-art in a number of
neural models for grid-like data such as texts and images. However, very
little work has been done on Graph Neural Networks (GNNs) learning on
unstructured network data. Given the huge number of choices and combinations of
components such as aggregators and activation functions, determining a suitable
GNN structure for a specific problem normally necessitates tremendous expert
knowledge and laborious trials. In addition, slight variations in
hyperparameters such as the learning rate and dropout rate can dramatically
hurt the learning capacity of a GNN. In this paper, we propose a novel AutoML
framework
through the evolution of individual models in a large GNN architecture space
involving both neural structures and learning parameters. Instead of optimizing
only the model structures with fixed parameter settings as in existing work, an
alternating evolution process is performed between GNN structures and learning
parameters to dynamically find the best fit of each other. To the best of our
knowledge, this is the first work to introduce and evaluate evolutionary
architecture search for GNN models. Experiments and validations demonstrate
that evolutionary NAS is capable of matching existing state-of-the-art
reinforcement learning approaches on both semi-supervised transductive and
inductive node representation learning and classification tasks.
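For concreteness, the alternating evolution process can be illustrated with a short sketch. Everything below is hypothetical: the search-space entries, the toy fitness function, and the elitist mutation-only selection scheme are illustrative stand-ins for the paper's actual GNN architecture space and training loop, not the authors' implementation.

```python
import random

# Hypothetical search spaces; the component choices and hyperparameter
# grids are illustrative, not the paper's exact GNN architecture space.
STRUCTURE_SPACE = {
    "aggregator": ["mean", "max", "sum", "attention"],
    "activation": ["relu", "tanh", "elu", "sigmoid"],
    "hidden_dim": [16, 32, 64, 128],
    "num_layers": [1, 2, 3],
}
PARAM_SPACE = {
    "learning_rate": [1e-3, 5e-3, 1e-2, 5e-2],
    "dropout": [0.0, 0.2, 0.5, 0.6],
    "weight_decay": [0.0, 1e-4, 5e-4],
}

def sample(space):
    """Draw one random individual (a dict of genes) from a search space."""
    return {gene: random.choice(choices) for gene, choices in space.items()}

def mutate(individual, space, rate=0.3):
    """Resample each gene independently with probability `rate`."""
    child = dict(individual)
    for gene, choices in space.items():
        if random.random() < rate:
            child[gene] = random.choice(choices)
    return child

def evaluate(structure, params):
    """Stand-in fitness function. In the real framework this would train
    the GNN defined by `structure` with hyperparameters `params` and
    return validation accuracy; a deterministic toy score keeps the
    sketch self-contained and runnable."""
    score = {"mean": 0.2, "max": 0.1, "sum": 0.15, "attention": 0.3}[
        structure["aggregator"]
    ]
    score += 0.1 if structure["activation"] == "relu" else 0.0
    score -= abs(params["dropout"] - 0.5) * 0.1
    score -= abs(params["learning_rate"] - 5e-3)
    return score

def evolve(space, fitness, pop_size=10, elite=4, generations=5):
    """Simple elitist evolution over one half of the configuration,
    with the other half held fixed inside `fitness`."""
    population = [sample(space) for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(population, key=fitness, reverse=True)[:elite]
        offspring = [
            mutate(random.choice(parents), space)
            for _ in range(pop_size - elite)
        ]
        population = parents + offspring
    return max(population, key=fitness)

def alternating_search(rounds=3):
    """Alternate between evolving structures (parameters fixed) and
    evolving parameters (structure fixed)."""
    structure, params = sample(STRUCTURE_SPACE), sample(PARAM_SPACE)
    for _ in range(rounds):
        structure = evolve(STRUCTURE_SPACE, lambda s: evaluate(s, params))
        params = evolve(PARAM_SPACE, lambda p: evaluate(structure, p))
    return structure, params

if __name__ == "__main__":
    random.seed(0)
    best_structure, best_params = alternating_search()
    print("best structure:", best_structure)
    print("best params:   ", best_params)
```

The point of the alternation is that neither half of the configuration stays fixed for the whole search: each round re-optimizes the structure under the current best hyperparameters and then the hyperparameters under the current best structure, so the two sides dynamically find the best fit of each other.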
Related papers
- Hyperbolic Benchmarking Unveils Network Topology-Feature Relationship in GNN Performance [0.5416466085090772]
We introduce a comprehensive benchmarking framework for graph machine learning.
We generate synthetic networks with realistic topological properties and node feature vectors.
Results highlight the dependency of model performance on the interplay between network structure and node features.
arXiv Detail & Related papers (2024-06-04T20:40:06Z) - Learning Invariant Representations of Graph Neural Networks via Cluster
Generalization [58.68231635082891]
Graph neural networks (GNNs) have become increasingly popular in modeling graph-structured data.
In this paper, we experimentally find that the performance of GNNs drops significantly when a structure shift happens.
We propose the Cluster Information Transfer (CIT) mechanism, which can learn invariant representations for GNNs.
arXiv Detail & Related papers (2024-03-06T10:36:56Z) - Label Deconvolution for Node Representation Learning on Large-scale
Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate the learning bias by a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate that LD significantly outperforms state-of-the-art methods on the Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z) - How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Auto-HeG: Automated Graph Neural Network on Heterophilic Graphs [62.665761463233736]
We propose an automated graph neural network on heterophilic graphs, namely Auto-HeG, to automatically build heterophilic GNN models.
Specifically, Auto-HeG incorporates heterophily into all stages of automatic heterophilic graph learning, including search space design, supernet training, and architecture selection.
arXiv Detail & Related papers (2023-02-23T22:49:56Z) - ConCerNet: A Contrastive Learning Based Framework for Automated
Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of DNN-based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z) - A Self-adaptive Neuroevolution Approach to Constructing Deep Neural
Network Architectures Across Different Types [5.429458930060452]
We propose a self-adaptive neuroevolution (SANE) approach to automatically construct various lightweight Deep Neural Network (DNN) architectures for different tasks.
One of the key settings in SANE is the search space defined by cells and organs self-adapted to different DNN types.
SANE is able to self-adaptively adjust evolution exploration and exploitation to improve search efficiency.
arXiv Detail & Related papers (2022-11-27T07:40:25Z) - Neural Architecture Search based on Cartesian Genetic Programming Coding
Method [6.519170476143571]
We propose CGPNAS, an evolutionary NAS approach based on CGP, to solve the sentence classification task.
The experimental results show that the searched architectures are comparable with the performance of human-designed architectures.
arXiv Detail & Related papers (2021-03-12T09:51:03Z) - Differentiable Neural Architecture Learning for Efficient Neural Network
Design [31.23038136038325]
We introduce a novel architecture parameterisation based on the scaled sigmoid function.
We then propose a general Differentiable Neural Architecture Learning (DNAL) method to optimize the neural architecture without the need to evaluate candidate neural networks.
arXiv Detail & Related papers (2021-03-03T02:03:08Z) - Simplifying Architecture Search for Graph Neural Network [38.45540097927176]
We propose the SNAG framework, consisting of a novel search space and a reinforcement learning based search algorithm.
Experiments on real-world datasets demonstrate the effectiveness of the SNAG framework compared to human-designed GNNs and NAS methods.
arXiv Detail & Related papers (2020-08-26T16:24:03Z) - Modeling from Features: a Mean-field Framework for Over-parameterized
Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.