A Generic Graph-based Neural Architecture Encoding Scheme for
Predictor-based NAS
- URL: http://arxiv.org/abs/2004.01899v3
- Date: Tue, 1 Sep 2020 01:06:51 GMT
- Title: A Generic Graph-based Neural Architecture Encoding Scheme for
Predictor-based NAS
- Authors: Xuefei Ning, Yin Zheng, Tianchen Zhao, Yu Wang, and Huazhong Yang
- Abstract summary: This work proposes a novel Graph-based neural ArchiTecture Encoding Scheme, a.k.a. GATES, to improve predictor-based neural architecture search.
GATES models the operations as transformations of the propagating information, which mimics the actual data processing of neural architectures.
- Score: 18.409809742204896
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work proposes a novel Graph-based neural ArchiTecture Encoding Scheme,
a.k.a. GATES, to improve the predictor-based neural architecture search.
Specifically, different from existing graph-based schemes, GATES models the
operations as the transformation of the propagating information, which mimics
the actual data processing of neural architectures. GATES provides a more reasonable
modeling of neural architectures, and can encode architectures from both the
"operation on node" and "operation on edge" cell search spaces
consistently. Experimental results on various search spaces confirm GATES's
effectiveness in improving the performance predictor. Furthermore, equipped
with the improved performance predictor, the sample efficiency of the
predictor-based neural architecture search (NAS) flow is boosted. Code is
available at https://github.com/walkerning/aw_nas.
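To make "operations as the transformation of the propagating information" concrete, below is a minimal PyTorch sketch of a GATES-style encoder for an "operation on node" search space. This is an illustration of the idea only, not the authors' implementation (see the linked repository for that): the class name `GatesLikeEncoder`, the sigmoid-gate form, and all dimensions are assumptions.

```python
import torch
import torch.nn as nn

class GatesLikeEncoder(nn.Module):
    """Sketch of a GATES-style encoder: each operation is modeled as a
    transformation (here, a learned sigmoid gate) applied to the
    "information" propagated along the architecture DAG."""

    def __init__(self, num_ops: int, hidden_dim: int = 64):
        super().__init__()
        self.op_emb = nn.Embedding(num_ops, hidden_dim)
        self.x_proj = nn.Linear(hidden_dim, hidden_dim)     # transforms incoming information
        self.gate_proj = nn.Linear(hidden_dim, hidden_dim)  # derives a gate from the op embedding

    def forward(self, node_ops, edges, num_nodes):
        # node_ops: LongTensor of per-node operation indices ("operation on node" space)
        # edges: list of (src, dst) pairs; nodes are assumed topologically ordered
        h = torch.zeros(num_nodes, self.op_emb.embedding_dim)
        h[0] = 1.0  # virtual information injected at the input node
        for dst in range(1, num_nodes):
            incoming = [src for src, d in edges if d == dst]
            if not incoming:
                continue
            gate = torch.sigmoid(self.gate_proj(self.op_emb(node_ops[dst])))
            # the operation gates the aggregated information, mimicking data processing
            h[dst] = gate * sum(self.x_proj(h[src]) for src in incoming)
        return h[-1]  # the output node's embedding represents the architecture
```

The intent is that the embedding reflects how an architecture processes data rather than only its graph topology; the same gating view can be attached to edges instead of nodes for "operation on edge" spaces.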
Related papers
- FR-NAS: Forward-and-Reverse Graph Predictor for Efficient Neural Architecture Search [10.699485270006601]
We introduce a novel Graph Neural Network (GNN) predictor for Neural Architecture Search (NAS).
This predictor renders neural architectures into vector representations by combining both the conventional and inverse graph views.
The experimental results showcase a significant improvement in prediction accuracy, with a 3%-16% increase in Kendall-tau correlation.
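As a rough illustration of combining the conventional and inverse graph views, the hedged sketch below runs one round of message passing over both the adjacency matrix and its transpose and concatenates the pooled results; the names, the concatenation fusion, and mean pooling are assumptions, not the FR-NAS design.

```python
import torch
import torch.nn as nn

class TwoViewGNNPredictor(nn.Module):
    """Sketch: encode an architecture DAG from both the forward adjacency
    and its transpose (the "reverse" view), then fuse for prediction."""

    def __init__(self, in_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.fwd = nn.Linear(in_dim, hidden_dim)
        self.rev = nn.Linear(in_dim, hidden_dim)
        self.head = nn.Linear(2 * hidden_dim, 1)  # scalar performance estimate

    def forward(self, x, adj):
        # x: (num_nodes, in_dim) node features; adj: (num_nodes, num_nodes)
        h_fwd = torch.relu(adj @ self.fwd(x))       # conventional graph view
        h_rev = torch.relu(adj.t() @ self.rev(x))   # inverse graph view
        g = torch.cat([h_fwd.mean(0), h_rev.mean(0)])  # pooled graph embedding
        return self.head(g)
```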
arXiv Detail & Related papers (2024-04-24T03:22:49Z)
- GeNAS: Neural Architecture Search with Better Generalization [14.92869716323226]
Recent neural architecture search (NAS) approaches rely on validation loss or accuracy to find a superior network for the target data.
In this paper, we investigate a new neural architecture search measure for discovering architectures with better generalization.
arXiv Detail & Related papers (2023-05-15T12:44:54Z)
- NAR-Former: Neural Architecture Representation Learning towards Holistic Attributes Prediction [37.357949900603295]
We propose a neural architecture representation model that can be used to estimate attributes holistically.
Experimental results show that the proposed framework can predict the latency and accuracy attributes of both cell architectures and whole deep neural networks.
arXiv Detail & Related papers (2022-11-15T10:15:21Z)
- Network Graph Based Neural Architecture Search [57.78724765340237]
We search for neural networks by rewiring the corresponding graph and predict architecture performance from graph properties.
Because we do not perform machine learning over the entire graph space, the search process is remarkably efficient.
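A hedged sketch of what "predicting performance from graph properties" could look like: compute cheap graph statistics with networkx and feed them to any off-the-shelf regressor. The particular statistics here are illustrative assumptions.

```python
import networkx as nx

def graph_property_features(g: nx.DiGraph) -> list[float]:
    """Cheap graph statistics of the kind such a predictor could use
    instead of training every candidate network (illustrative)."""
    ug = g.to_undirected()
    return [
        g.number_of_nodes(),
        g.number_of_edges(),
        nx.density(g),
        nx.average_clustering(ug),
        sum(d for _, d in g.degree()) / g.number_of_nodes(),  # mean degree
    ]

# A rewired candidate graph is re-scored from its properties alone.
g = nx.gnm_random_graph(10, 20, directed=True)  # stand-in for a candidate network
print(graph_property_features(g))
```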
arXiv Detail & Related papers (2021-12-15T00:12:03Z)
- Edge-featured Graph Neural Architecture Search [131.4361207769865]
We propose Edge-featured Graph Neural Architecture Search to find the optimal GNN architecture.
Specifically, we design rich entity and edge updating operations to learn high-order representations.
We show that EGNAS can find GNNs with higher performance than current state-of-the-art human-designed and search-based GNNs.
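The sketch below illustrates what "entity and edge updating operations" can mean in an edge-featured GNN layer: edge representations are refreshed from their endpoints and then aggregated back into the node update. The layer structure is an assumption for illustration, not the EGNAS operations.

```python
import torch
import torch.nn as nn

class EdgeFeaturedLayer(nn.Module):
    """Sketch of one edge-featured GNN layer updating both node
    ("entity") and edge representations."""

    def __init__(self, dim: int):
        super().__init__()
        self.edge_update = nn.Linear(3 * dim, dim)  # (src, dst, edge) -> edge
        self.node_update = nn.Linear(2 * dim, dim)  # (node, aggregated edges) -> node

    def forward(self, h, e, edges):
        # h: (N, dim) node features; e: (E, dim) edge features; edges: [(src, dst)]
        new_e = torch.stack([
            torch.relu(self.edge_update(torch.cat([h[s], h[d], e[k]])))
            for k, (s, d) in enumerate(edges)
        ])
        msg = torch.zeros_like(h)
        for k, (_, d) in enumerate(edges):
            msg[d] = msg[d] + new_e[k]  # aggregate updated edge features per node
        new_h = torch.relu(self.node_update(torch.cat([h, msg], dim=1)))
        return new_h, new_e
```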
arXiv Detail & Related papers (2021-09-03T07:53:18Z)
- Rethinking Graph Neural Network Search from Message-passing [120.62373472087651]
This paper proposes Graph Neural Architecture Search (GNAS) with a novel search space.
We design a Graph Neural Architecture Paradigm (GAP) with a tree-topology computation procedure and two types of fine-grained atomic operations.
Experiments show that our GNAS can search for better GNNs with multiple message-passing mechanisms and optimal message-passing depth.
arXiv Detail & Related papers (2021-03-26T06:10:41Z)
- Neural Architecture Search based on Cartesian Genetic Programming Coding Method [6.519170476143571]
We propose an evolutionary approach to NAS based on Cartesian Genetic Programming (CGP), called CGPNAS, to solve sentence classification tasks.
The experimental results show that the searched architectures are comparable in performance to human-designed architectures.
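For readers unfamiliar with CGP encodings, here is a hedged sketch of how a flat integer genotype decodes into a feed-forward graph. The function set and gene layout are illustrative assumptions, not CGPNAS's actual encoding.

```python
# Hypothetical function set for illustration; CGPNAS's own set will differ.
FUNCTIONS = ["conv3x3", "conv5x5", "max_pool", "identity"]

def decode_cgp(genotype: list[int]) -> list[tuple]:
    """Decode a flat genotype into (function, input_a, input_b) nodes.
    In CGP, node inputs may only reference earlier nodes or the graph
    inputs, so the decoded result is always a DAG."""
    nodes = []
    for i in range(0, len(genotype), 3):
        f, a, b = genotype[i : i + 3]
        nodes.append((FUNCTIONS[f % len(FUNCTIONS)], a, b))
    return nodes

# Evolution mutates the integer string; decoding yields the architecture.
print(decode_cgp([0, 0, 0, 2, 1, 0, 3, 2, 1]))
```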
arXiv Detail & Related papers (2021-03-12T09:51:03Z)
- DrNAS: Dirichlet Neural Architecture Search [88.56953713817545]
We treat the continuously relaxed architecture mixing weights as random variables, modeled by a Dirichlet distribution.
With recently developed pathwise derivatives, the Dirichlet parameters can be easily optimized with gradient-based optimizers.
To alleviate the large memory consumption of differentiable NAS, we propose a simple yet effective progressive learning scheme.
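The pathwise-derivative idea can be reproduced in a few lines, since PyTorch's `torch.distributions.Dirichlet` supports `rsample()` (implicit reparameterization). The snippet below is a minimal stand-in for a single mixed edge, not the DrNAS code.

```python
import torch
from torch.distributions import Dirichlet

num_ops = 8
# Learnable Dirichlet concentration; in practice it would be parameterized
# to stay positive (e.g., via softplus).
concentration = torch.ones(num_ops, requires_grad=True)

weights = Dirichlet(concentration).rsample()   # differentiable sample on the simplex
candidate_outputs = torch.randn(num_ops, 16)   # stand-in for per-operation outputs
mixed = (weights.unsqueeze(1) * candidate_outputs).sum(0)

# Any loss on `mixed` backpropagates into `concentration` through the sample.
mixed.sum().backward()
print(concentration.grad.shape)  # torch.Size([8])
```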
arXiv Detail & Related papers (2020-06-18T08:23:02Z)
- A Semi-Supervised Assessor of Neural Architectures [157.76189339451565]
We employ an auto-encoder to discover meaningful representations of neural architectures.
A graph convolutional neural network is introduced to predict the performance of architectures.
arXiv Detail & Related papers (2020-05-14T09:02:33Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
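As an illustration of the kind of binarization being described, the sketch below uses sign binarization with a straight-through estimator inside a single GNN layer; this is a common construction for binarized networks and an assumption about BGN's details.

```python
import torch

class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through gradient estimator."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out * (x.abs() <= 1).float()  # pass gradients only near zero

def binarized_gnn_layer(adj, h, weight):
    # Binarize both activations and parameters so neighbor aggregation
    # reduces to cheap bit operations at inference time.
    h_bin = BinarizeSTE.apply(h)
    w_bin = BinarizeSTE.apply(weight)
    return BinarizeSTE.apply(adj @ h_bin @ w_bin)
```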
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.