XC-NAS: A New Cellular Encoding Approach for Neural Architecture Search
of Multi-path Convolutional Neural Networks
- URL: http://arxiv.org/abs/2312.07760v1
- Date: Tue, 12 Dec 2023 22:03:11 GMT
- Authors: Trevor Londt, Xiaoying Gao, Peter Andreae, Yi Mei
- Abstract summary: This paper introduces an algorithm capable of evolving novel multi-path CNN architectures of varying depth, width, and complexity for image and text classification tasks.
By using a surrogate model approach, we show that the algorithm can evolve a performant CNN architecture in less than one GPU day.
Experiment results show that the algorithm is highly competitive, defeating several state-of-the-art methods, and is generalisable to both the image and text domains.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Convolutional Neural Networks (CNNs) continue to achieve great success in
classification tasks as innovative techniques and complex multi-path
architecture topologies are introduced. Neural Architecture Search (NAS) aims
to automate the design of these complex architectures, reducing the need for
costly manual design work by human experts. Cellular Encoding (CE) is an
evolutionary computation technique which excels in constructing novel
multi-path topologies of varying complexity and has recently been applied with
NAS to evolve CNN architectures for various classification tasks. However,
existing CE approaches have severe limitations. They are restricted to only one
domain, only partially implement the theme of CE, or only focus on the
micro-architecture search space. This paper introduces a new CE representation
and algorithm capable of evolving novel multi-path CNN architectures of varying
depth, width, and complexity for image and text classification tasks. The
algorithm explicitly focuses on the macro-architecture search space.
Furthermore, by using a surrogate model approach, we show that the algorithm
can evolve a performant CNN architecture in less than one GPU day, thereby
allowing a sufficient number of experiment runs to be conducted to achieve
scientific robustness. Experiment results show that the approach is highly
competitive, defeating several state-of-the-art methods, and is generalisable
to both the image and text domains.
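The abstract's core loop pairs a cellular-encoding-style genotype with a cheap surrogate fitness so candidate CNNs can be ranked without full training. A minimal sketch of that surrogate-assisted evolutionary search is below; the operator names, the toy surrogate heuristic, and all function names are illustrative assumptions, not the paper's actual CE operator set or surrogate model.

```python
import random

random.seed(0)

# Hypothetical cellular-encoding-style operations (illustrative only;
# not the paper's actual operator vocabulary).
OPS = ["SEQ", "PAR", "DOUBLE_WIDTH", "ADD_CONV", "END"]


def random_genotype(length=6):
    """Sample a random program of cellular-encoding-style operations."""
    return [random.choice(OPS) for _ in range(length)]


def surrogate_fitness(genotype):
    """Stand-in for the surrogate model: instead of fully training the
    decoded CNN, score the genotype cheaply. This toy heuristic rewards
    multi-path structure (PAR) and extra convolutions (ADD_CONV)."""
    return genotype.count("PAR") + 0.5 * genotype.count("ADD_CONV")


def mutate(genotype):
    """Point mutation: replace one operation with a random alternative."""
    child = list(genotype)
    i = random.randrange(len(child))
    child[i] = random.choice(OPS)
    return child


def evolve(pop_size=10, generations=20):
    """Truncation-selection evolutionary loop guided by the surrogate."""
    population = [random_genotype() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=surrogate_fitness, reverse=True)
        parents = ranked[: pop_size // 2]          # keep the top half
        offspring = [mutate(random.choice(parents))
                     for _ in range(pop_size - len(parents))]
        population = parents + offspring
    return max(population, key=surrogate_fitness)


best = evolve()
```

In the paper's setting, the surrogate evaluation is what brings the search under one GPU day; here it is reduced to a constant-time heuristic purely to keep the loop runnable.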
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have been recently successfully applied for eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z) - ECToNAS: Evolutionary Cross-Topology Neural Architecture Search [0.0]
ECToNAS is a cost-efficient evolutionary cross-topology neural architecture search algorithm.
It fuses training and topology optimisation together into one lightweight, resource-friendly process.
arXiv Detail & Related papers (2024-03-08T07:36:46Z) - OFA$^2$: A Multi-Objective Perspective for the Once-for-All Neural
Architecture Search [79.36688444492405]
Once-for-All (OFA) is a Neural Architecture Search (NAS) framework designed to address the problem of searching efficient architectures for devices with different resource constraints.
We aim to go one step further in the search for efficiency by explicitly conceiving the search stage as a multi-objective optimization problem.
arXiv Detail & Related papers (2023-03-23T21:30:29Z) - POPNASv3: a Pareto-Optimal Neural Architecture Search Solution for Image
and Time Series Classification [8.190723030003804]
This article presents the third version of a sequential model-based NAS algorithm targeting different hardware environments and multiple classification tasks.
Our method is able to find competitive architectures within large search spaces, while keeping a flexible structure and data processing pipeline to adapt to different tasks.
The experiments performed on images and time series classification datasets provide evidence that POPNASv3 can explore a large set of assorted operators and converge to optimal architectures suited for the type of data provided under different scenarios.
arXiv Detail & Related papers (2022-12-13T17:14:14Z) - Surrogate-assisted Multi-objective Neural Architecture Search for
Real-time Semantic Segmentation [11.866947846619064]
Neural architecture search (NAS) has emerged as a promising avenue toward automating the design of architectures.
We propose a surrogate-assisted multi-objective method to address the challenges of applying NAS to semantic segmentation.
Our method can identify architectures significantly outperforming existing state-of-the-art architectures designed both manually by human experts and automatically by other NAS methods.
arXiv Detail & Related papers (2022-08-14T10:18:51Z) - Learning Interpretable Models Through Multi-Objective Neural
Architecture Search [0.9990687944474739]
We propose a framework to optimize for both task performance and "introspectability," a surrogate metric for aspects of interpretability.
We demonstrate that jointly optimizing for task error and introspectability leads to more disentangled and debuggable architectures that perform within a comparable error range.
arXiv Detail & Related papers (2021-12-16T05:50:55Z) - NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural
Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z) - NAS-DIP: Learning Deep Image Prior with Neural Architecture Search [65.79109790446257]
Recent work has shown that the structure of deep convolutional neural networks can be used as a structured image prior.
We propose to search for neural architectures that capture stronger image priors.
We search for an improved network by leveraging an existing neural architecture search algorithm.
arXiv Detail & Related papers (2020-08-26T17:59:36Z) - Automated Search for Resource-Efficient Branched Multi-Task Networks [81.48051635183916]
We propose a principled approach, rooted in differentiable neural architecture search, to automatically define branching structures in a multi-task neural network.
We show that our approach consistently finds high-performing branching structures within limited resource budgets.
arXiv Detail & Related papers (2020-08-24T09:49:19Z) - NAS-Count: Counting-by-Density with Neural Architecture Search [74.92941571724525]
We automate the design of counting models with Neural Architecture Search (NAS).
We introduce an end-to-end searched encoder-decoder architecture, Automatic Multi-Scale Network (AMSNet).
arXiv Detail & Related papers (2020-02-29T09:18:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.