Network Graph Based Neural Architecture Search
- URL: http://arxiv.org/abs/2112.07805v1
- Date: Wed, 15 Dec 2021 00:12:03 GMT
- Title: Network Graph Based Neural Architecture Search
- Authors: Zhenhan Huang, Chunheng Jiang, Pin-Yu Chen and Jianxi Gao
- Abstract summary: We search for neural architectures by rewiring the corresponding graph and predict architecture performance from graph properties.
Because we do not perform machine learning over the entire graph space, the search process is remarkably efficient.
- Score: 57.78724765340237
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural architecture search enables the automation of architecture design. Despite
its success, it is computationally costly and provides little insight into how
to design a desirable architecture. Here we propose a new way of searching for
neural networks: we search for an architecture by rewiring the corresponding
graph and predict its performance from graph properties. Because we do not
perform machine learning over the entire graph space and instead use predicted
architecture performance to guide the search, the search process is remarkably
efficient. We find that graph-based search gives a reasonably good prediction
of desirable architectures. In addition, we identify graph properties that are
effective predictors of architecture performance. Our work proposes a new way
of searching for neural architectures and provides insights into neural
architecture design.
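To make the approach concrete, here is a minimal, hypothetical sketch of graph-based search: an architecture is encoded as a small NetworkX graph, a stand-in linear model over two graph properties (average clustering and average shortest-path length) plays the role of the learned performance predictor, and hill-climbing over random rewirings searches the space. The predictor weights, graph sizes, and rewiring rule are illustrative assumptions, not the paper's exact procedure.

```python
import random
import networkx as nx

def graph_properties(g):
    """Cheap structural features used as inputs to the predictor."""
    return [nx.average_clustering(g),
            nx.average_shortest_path_length(g)]

def predicted_score(g, weights=(1.0, -0.5)):
    # Placeholder linear predictor; real coefficients would be fit on a
    # small sample of trained architectures (hypothetical values here).
    return sum(w * p for w, p in zip(weights, graph_properties(g)))

def rewire(g, rng):
    """Swap one random edge for a random non-edge; reject disconnecting moves."""
    h = g.copy()
    h.remove_edge(*rng.choice(list(h.edges())))
    h.add_edge(*rng.choice(list(nx.non_edges(h))))
    return h if nx.is_connected(h) else g

def search(n_nodes=16, n_steps=200, seed=0):
    rng = random.Random(seed)
    g = nx.connected_watts_strogatz_graph(n_nodes, 4, 0.1, seed=seed)
    best, best_score = g, predicted_score(g)
    for _ in range(n_steps):
        cand = rewire(best, rng)
        score = predicted_score(cand)
        if score > best_score:
            best, best_score = cand, score
    return best, best_score

if __name__ == "__main__":
    _, score = search()
    print(f"best predicted score: {score:.3f}")
```

Hill-climbing is used here only for brevity; any search strategy that queries the property-based predictor instead of training networks preserves the same efficiency argument.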
Related papers
- Knowledge-aware Evolutionary Graph Neural Architecture Search [49.13787973318586]
Graph neural architecture search (GNAS) can customize high-performance graph neural network architectures for specific graph tasks or datasets.
Existing GNAS methods begin searching for architectures from a zero-knowledge state, ignoring prior knowledge that could improve search efficiency.
This study proposes exploiting such prior knowledge to accelerate the multi-objective evolutionary search on a new graph dataset.
arXiv Detail & Related papers (2024-11-26T11:32:45Z)
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have recently been applied successfully to eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z)
- Unsupervised Graph Neural Architecture Search with Disentangled Self-supervision [51.88848982611515]
Unsupervised graph neural architecture search remains unexplored in the literature.
We propose a novel Disentangled Self-supervised Graph Neural Architecture Search model.
Our model achieves state-of-the-art performance against several baseline methods in an unsupervised manner.
arXiv Detail & Related papers (2024-03-08T05:23:55Z)
- Neural Architecture Retrieval [27.063268631346713]
We define a new problem, Neural Architecture Retrieval, which retrieves a set of existing neural architectures with designs similar to the query architecture.
Existing graph pre-training strategies cannot handle the computational graphs in neural architectures because of their size and motifs.
We introduce multi-level contrastive learning to achieve accurate graph representation learning.
arXiv Detail & Related papers (2023-07-16T01:56:41Z)
- Edge-featured Graph Neural Architecture Search [131.4361207769865]
We propose Edge-featured Graph Neural Architecture Search to find the optimal GNN architecture.
Specifically, we design rich entity and edge updating operations to learn high-order representations.
We show that EGNAS can find better GNNs with higher performance than current state-of-the-art human-designed and search-based GNNs.
arXiv Detail & Related papers (2021-09-03T07:53:18Z)
- Interpretable Neural Architecture Search via Bayesian Optimisation with Weisfeiler-Lehman Kernels [17.945881805452288]
Current neural architecture search (NAS) strategies focus on finding a single, well-performing architecture.
We propose a Bayesian optimisation approach for NAS that combines the Weisfeiler-Lehman graph kernel with a Gaussian process surrogate, as sketched below.
Our method affords interpretability by discovering useful network features and their impact on network performance.
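A minimal, self-contained sketch of such a surrogate, assuming toy graphs given as adjacency lists with integer node labels: a Weisfeiler-Lehman subtree kernel builds a Gram matrix over architecture graphs, and the Gaussian-process posterior mean predicts accuracy for a query graph. Every name and value below is illustrative rather than taken from the paper.

```python
from collections import Counter
import numpy as np

def wl_features(adj, labels, iters=3):
    """Bag of Weisfeiler-Lehman subtree labels after `iters` relabelling rounds."""
    feats = Counter(labels)
    for _ in range(iters):
        labels = [hash((labels[v], tuple(sorted(labels[u] for u in adj[v]))))
                  for v in range(len(adj))]
        feats.update(labels)
    return feats

def wl_kernel(f1, f2):
    """Dot product of two WL feature histograms."""
    return sum(c * f2[k] for k, c in f1.items())

def gp_posterior_mean(train_feats, y_train, test_feats, noise=1e-3):
    """GP regression mean with a precomputed WL Gram matrix."""
    K = np.array([[wl_kernel(a, b) for b in train_feats] for a in train_feats], float)
    k_star = np.array([[wl_kernel(t, a) for a in train_feats] for t in test_feats], float)
    alpha = np.linalg.solve(K + noise * np.eye(len(K)), y_train)
    return k_star @ alpha

# Toy usage: two labelled "architectures" with known accuracy, one query.
g1 = ([[1], [0, 2], [1]], [0, 1, 0])        # path graph
g2 = ([[1, 2], [0, 2], [0, 1]], [0, 1, 1])  # triangle
query = ([[1], [0, 2], [1]], [0, 1, 1])
train = [wl_features(*g) for g in (g1, g2)]
print(gp_posterior_mean(train, np.array([0.70, 0.90]), [wl_features(*query)]))
```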
arXiv Detail & Related papers (2020-06-13T04:10:34Z)
- Does Unsupervised Architecture Representation Learning Help Neural Architecture Search? [22.63641173256389]
Existing Neural Architecture Search (NAS) methods either encode neural architectures with discrete encodings that do not scale well, or adopt supervised learning to jointly learn architecture representations and optimize the search over those representations, which incurs search bias.
We observe that the structural properties of neural architectures are hard to preserve in the latent space if architecture representation learning and search are coupled, resulting in less effective search performance.
arXiv Detail & Related papers (2020-06-12T04:15:34Z)
- A Semi-Supervised Assessor of Neural Architectures [157.76189339451565]
We employ an auto-encoder to discover meaningful representations of neural architectures.
A graph convolutional neural network is introduced to predict the performance of architectures; a toy version of such a predictor is sketched below.
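For illustration only, a minimal numpy sketch of a GCN-style performance predictor, not the authors' assessor: one symmetric-normalised graph convolution over an architecture graph, mean pooling, and a linear head that outputs a predicted accuracy. The graph, node features, and weights are all random stand-ins for trained parameters.

```python
import numpy as np

def gcn_predict(adj, feats, w_gcn, w_out):
    """One GCN layer (symmetric-normalised) + mean pooling + linear head."""
    a_hat = adj + np.eye(len(adj))                 # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    h = np.maximum(d_inv_sqrt @ a_hat @ d_inv_sqrt @ feats @ w_gcn, 0.0)
    return float(h.mean(axis=0) @ w_out)           # pooled embedding -> score

rng = np.random.default_rng(0)
adj = np.array([[0, 1, 1, 0],                      # toy 4-node architecture graph
                [1, 0, 1, 1],
                [1, 1, 0, 1],
                [0, 1, 1, 0]], float)
feats = rng.normal(size=(4, 8))                    # per-node operation features
print(gcn_predict(adj, feats, rng.normal(size=(8, 16)), rng.normal(size=16)))
```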
arXiv Detail & Related papers (2020-05-14T09:02:33Z)