A Novel Evolutionary Algorithm for Hierarchical Neural Architecture Search
- URL: http://arxiv.org/abs/2107.08484v2
- Date: Thu, 4 May 2023 16:05:24 GMT
- Title: A Novel Evolutionary Algorithm for Hierarchical Neural Architecture Search
- Authors: Aristeidis Christoforidis, George Kyriakides, Konstantinos Margaritis
- Abstract summary: We propose a novel evolutionary algorithm for neural architecture search, applicable to global search spaces.
The algorithm's architectural representation organizes the topology into multiple hierarchical modules, and the design process exploits this representation to explore the search space.
We apply our method to Fashion-MNIST and NAS-Bench101, achieving accuracies of $93.2\%$ and $94.8\%$, respectively, in a relatively small number of generations.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In this work, we propose a novel evolutionary algorithm for neural architecture search, applicable to global search spaces. The algorithm's architectural representation organizes the topology into multiple hierarchical modules, and the design process exploits this representation to explore the search space. We also employ a curation system, which promotes the propagation of well-performing sub-structures to subsequent generations. We apply our method to Fashion-MNIST and NAS-Bench101, achieving accuracies of $93.2\%$ and $94.8\%$, respectively, in a relatively small number of generations.
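The abstract does not come with code, so the following is only a minimal Python sketch of the idea as described: genomes are nested hierarchies of modules, mutation regrows random sub-modules, and a simple elite pool stands in for the curation system. All names (`random_module`, `mutate`, `evolve`, the primitive set) are hypothetical, not the authors' implementation.

```python
import random

# Hypothetical sketch of the hierarchical representation: a genome is a
# nested hierarchy of modules whose leaves are primitive operations.
PRIMITIVES = ["conv3x3", "conv5x5", "maxpool", "identity"]

def random_module(depth):
    """Grow a random module: a leaf operation or a list of sub-modules."""
    if depth <= 0 or random.random() < 0.3:
        return random.choice(PRIMITIVES)
    return [random_module(depth - 1) for _ in range(random.randint(2, 3))]

def mutate(module, depth=2):
    """Replace a randomly chosen sub-module with a freshly grown one."""
    if isinstance(module, str) or random.random() < 0.3:
        return random_module(depth)
    module = list(module)                      # copy only the mutated path
    i = random.randrange(len(module))
    module[i] = mutate(module[i], depth - 1)
    return module

def evolve(fitness, generations=20, pop_size=10, elite=3):
    """Evolutionary loop; an elite pool stands in for the curation system."""
    population = [random_module(depth=3) for _ in range(pop_size)]
    for _ in range(generations):
        curated = sorted(population, key=fitness, reverse=True)[:elite]
        population = curated + [mutate(random.choice(curated))
                                for _ in range(pop_size - elite)]
    return max(population, key=fitness)

# Toy fitness that rewards hierarchies rich in conv3x3 leaves.
def toy_fitness(m):
    if isinstance(m, str):
        return 1.0 if m == "conv3x3" else 0.0
    return sum(toy_fitness(c) for c in m)

print(evolve(toy_fitness))
```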
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have recently been applied successfully to eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z) - ECToNAS: Evolutionary Cross-Topology Neural Architecture Search [0.0]
- ECToNAS: Evolutionary Cross-Topology Neural Architecture Search [0.0]
ECToNAS is a cost-efficient evolutionary cross-topology neural architecture search algorithm.
It fuses training and topology optimisation into one lightweight, resource-friendly process.
arXiv Detail & Related papers (2024-03-08T07:36:46Z) - XC-NAS: A New Cellular Encoding Approach for Neural Architecture Search
- XC-NAS: A New Cellular Encoding Approach for Neural Architecture Search of Multi-path Convolutional Neural Networks [0.4915744683251149]
This paper introduces an algorithm capable of evolving novel multi-path CNN architectures of varying depth, width, and complexity for image and text classification tasks.
By using a surrogate model approach, we show that the algorithm can evolve a performant CNN architecture in less than one GPU day.
Experimental results show that the algorithm is highly competitive, outperforming several state-of-the-art methods, and is generalisable to both the image and text domains.
arXiv Detail & Related papers (2023-12-12T22:03:11Z) - Construction of Hierarchical Neural Architecture Search Spaces based on
- Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars [66.05096551112932]
We introduce a unifying search space design framework based on context-free grammars.
By enhancing and using their properties, we effectively enable search over the complete architecture.
We show that our search strategy can be superior to existing Neural Architecture Search approaches.
arXiv Detail & Related papers (2022-11-03T14:23:00Z) - NeuralArTS: Structuring Neural Architecture Search with Type Theory [0.0]
- NeuralArTS: Structuring Neural Architecture Search with Type Theory [0.0]
We present a new framework called Neural Architecture Type System (NeuralArTS) that categorizes the infinite set of network operations in a structured type system.
We show how NeuralArTS can be applied to convolutional layers and propose several future directions.
arXiv Detail & Related papers (2021-10-17T03:28:27Z) - Redefining Neural Architecture Search of Heterogeneous Multi-Network
- Redefining Neural Architecture Search of Heterogeneous Multi-Network Models by Characterizing Variation Operators and Model Components [71.03032589756434]
We investigate the effect of different variation operators in a complex domain, that of multi-network heterogeneous neural models.
We characterize the variation operators according to their effect on the complexity and performance of the model, and the models themselves using diverse metrics that estimate the quality of their component parts.
arXiv Detail & Related papers (2021-06-16T17:12:26Z) - On the Exploitation of Neuroevolutionary Information: Analyzing the Past
- On the Exploitation of Neuroevolutionary Information: Analyzing the Past for a More Efficient Future [60.99717891994599]
We propose an approach that extracts information from neuroevolutionary runs and uses it to build a metamodel.
We inspect the best structures found during neuroevolutionary searches of generative adversarial networks with varying characteristics.
arXiv Detail & Related papers (2021-05-26T20:55:29Z) - Neighborhood-Aware Neural Architecture Search [43.87465987957761]
- Neighborhood-Aware Neural Architecture Search [43.87465987957761]
We propose a novel neural architecture search (NAS) method to identify flat-minima architectures in the search space.
Our formulation takes the "flatness" of an architecture into account by aggregating the performance over the neighborhood of this architecture.
Based on our formulation, we propose neighborhood-aware random search (NA-RS) and neighborhood-aware differentiable architecture search (NA-DARTS).
arXiv Detail & Related papers (2021-05-13T15:56:52Z) - Evolving Search Space for Neural Architecture Search [70.71153433676024]
- Evolving Search Space for Neural Architecture Search [70.71153433676024]
We present a Neural Search-space Evolution (NSE) scheme that amplifies the results from the previous effort by maintaining an optimized search space subset.
We achieve 77.3% top-1 retrain accuracy on ImageNet with 333M FLOPs, yielding state-of-the-art performance.
When a latency constraint is adopted, our result also surpasses the previous best-performing mobile models, with a 77.9% top-1 retrain accuracy.
arXiv Detail & Related papers (2020-11-22T01:11:19Z) - GOLD-NAS: Gradual, One-Level, Differentiable [100.12492801459105]
- GOLD-NAS: Gradual, One-Level, Differentiable [100.12492801459105]
We propose a novel algorithm named Gradual One-Level Differentiable Neural Architecture Search (GOLD-NAS).
It introduces a variable resource constraint into one-level optimization so that weak operators are gradually pruned out of the super-network.
arXiv Detail & Related papers (2020-07-07T10:37:49Z)