Ecological Neural Architecture Search
- URL: http://arxiv.org/abs/2503.10908v1
- Date: Thu, 13 Mar 2025 21:40:25 GMT
- Title: Ecological Neural Architecture Search
- Authors: Benjamin David Winter, William J. Teahan
- Abstract summary: This paper introduces Neuvo Ecological Neural Architecture Search (ENAS). ENAS incorporates evolutionary parameters directly into the candidate solutions' phenotypes, allowing them to evolve dynamically alongside architecture specifications.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: When employing an evolutionary algorithm to optimize a neural network's architecture, developers face the added challenge of tuning the evolutionary algorithm's own hyperparameters - population size, mutation rate, cloning rate, and number of generations. This paper introduces Neuvo Ecological Neural Architecture Search (ENAS), a novel method that incorporates these evolutionary parameters directly into the candidate solutions' phenotypes, allowing them to evolve dynamically alongside architecture specifications. Experimental results across four binary classification datasets demonstrate that ENAS not only eliminates manual tuning of evolutionary parameters but also outperforms competitor NAS methodologies in convergence speed (reducing computational time by 18.3%) and accuracy (improving classification performance in 3 out of 4 datasets). By enabling "greedy individuals" to optimize resource allocation based on fitness, ENAS provides an efficient, self-regulating approach to neural architecture search.
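To make the mechanism concrete, here is a minimal Python sketch of a self-adaptive phenotype in the spirit of ENAS: each individual carries its own evolutionary parameters alongside its architecture genes, and those parameters mutate with everything else. The names, value ranges, and log-normal self-adaptation rule below are illustrative assumptions, not the paper's actual encoding.

```python
import random

ACTIVATIONS = ["relu", "tanh", "sigmoid"]

def random_individual():
    """Phenotype mixing architecture genes with evolutionary parameters."""
    return {
        "layers": [random.randint(4, 64) for _ in range(random.randint(1, 4))],
        "activation": random.choice(ACTIVATIONS),
        # Self-adaptive evolutionary parameters carried in the phenotype.
        "mutation_rate": random.uniform(0.01, 0.5),
        "cloning_rate": random.uniform(0.0, 0.3),
    }

def mutate(ind):
    """Mutate using the individual's own mutation rate; the rate itself
    also mutates, so it co-evolves with the architecture."""
    child = dict(ind, layers=list(ind["layers"]))
    if random.random() < child["mutation_rate"]:
        i = random.randrange(len(child["layers"]))
        child["layers"][i] = random.randint(4, 64)
    if random.random() < child["mutation_rate"]:
        child["activation"] = random.choice(ACTIVATIONS)
    # Log-normal self-adaptation, borrowed from evolution strategies.
    child["mutation_rate"] = min(0.5, max(
        0.01, child["mutation_rate"] * random.lognormvariate(0.0, 0.2)))
    return child

parent = random_individual()
child = mutate(parent)
```

Because the rates live in the genome, selection alone tunes them: individuals whose evolutionary parameters suit the current search landscape tend to produce fitter offspring.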
Related papers
- Evolution Meets Diffusion: Efficient Neural Architecture Generation [1.8284471682448833]
Neural Architecture Search (NAS) has gained widespread attention for its transformative potential in deep learning model design.
We propose Evolutionary Diffusion-based Neural Architecture Generation (EDNAG), a novel approach that achieves efficient and training-free architecture generation.
EDNAG achieves state-of-the-art (SOTA) performance in architecture optimization, with an improvement of up to 10.45%.
It eliminates the need for time-consuming training and boosts inference speed by an average of 50 times, showcasing its exceptional efficiency and effectiveness.
arXiv Detail & Related papers (2025-04-24T03:09:04Z) - Evolutionary Neural Architecture Search for 3D Point Cloud Analysis [0.6378824981027466]
This paper presents Success-History-based Self-adaptive Differential Evolution with a Joint Point Interaction Dimension Search (SHSADE-PIDS).
SHSADE-PIDS is an evolutionary NAS framework that encodes discrete deep neural network architectures into a continuous space and searches that space for efficient point cloud neural architectures.
Comprehensive experiments on challenging 3D segmentation and classification benchmarks demonstrate SHSADE-PIDS's capabilities.
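The central trick, searching a continuous space that decodes to discrete architectures, can be sketched as follows; the gene layout, operation set, and ranges are invented for illustration and are not the paper's encoding.

```python
import numpy as np

OPS = ["mlp", "point_conv", "attention", "skip"]

def decode(x):
    """Map a continuous vector in [0, 1) to a discrete architecture."""
    x = np.clip(x, 0.0, 1.0 - 1e-9)
    n_layers = 1 + int(x[0] * 4)                              # 1..4 layers
    ops = [OPS[int(g * len(OPS))] for g in x[1:1 + n_layers]]
    widths = [int(16 + g * 240) for g in x[5:5 + n_layers]]   # 16..256 channels
    return list(zip(ops, widths))

# Differential evolution then operates purely on real-valued vectors.
population = np.random.rand(20, 9)
print(decode(population[0]))
```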
arXiv Detail & Related papers (2024-08-10T13:41:18Z) - A First Step Towards Runtime Analysis of Evolutionary Neural Architecture Search [24.056523078277053]
This work takes preliminary steps toward the mathematical runtime analysis of ENAS.
We define a binary classification problem $\textsc{UNIFORM}$, and formulate an explicit fitness function to represent the relationship between neural architecture and classification accuracy.
The theoretical results show that the local and global mutations achieve nearly the same performance on $\textsc{UNIFORM}$.
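For intuition, the local and global mutation operators studied in runtime analysis are usually the one-bit-flip and standard-bit-flip operators on a bitstring; a textbook sketch (not necessarily the paper's exact definitions):

```python
import random

def local_mutation(bits):
    """Flip exactly one uniformly chosen bit (local mutation)."""
    child = bits[:]
    child[random.randrange(len(child))] ^= 1
    return child

def global_mutation(bits):
    """Flip each bit independently with probability 1/n (global mutation)."""
    p = 1.0 / len(bits)
    return [b ^ (random.random() < p) for b in bits]

genome = [random.randint(0, 1) for _ in range(16)]
print(local_mutation(genome), global_mutation(genome))
```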
arXiv Detail & Related papers (2024-01-22T06:29:22Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - HKNAS: Classification of Hyperspectral Imagery Based on Hyper Kernel Neural Architecture Search [104.45426861115972]
We propose to generate structural parameters directly using specially designed hyper kernels.
We obtain three kinds of networks to separately conduct pixel-level or image-level classifications with 1-D or 3-D convolutions.
A series of experiments on six public datasets demonstrate that the proposed methods achieve state-of-the-art results.
arXiv Detail & Related papers (2023-04-23T17:27:40Z) - On the Exploitation of Neuroevolutionary Information: Analyzing the Past for a More Efficient Future [60.99717891994599]
We propose an approach that extracts information from neuroevolutionary runs and uses it to build a metamodel.
We inspect the best structures found during neuroevolutionary searches of generative adversarial networks with varying characteristics.
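One plausible shape for such a metamodel is a surrogate regressor fitted on (architecture features, fitness) pairs harvested from earlier runs and used to pre-rank new candidates before any training is spent on them; the feature encoding and model choice below are assumptions, not the paper's.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Stand-in data: architecture feature vectors and fitness from past runs.
X_past = np.random.rand(500, 8)
y_past = np.random.rand(500)
surrogate = RandomForestRegressor(n_estimators=100).fit(X_past, y_past)

# Cheaply rank fresh candidates with the learned metamodel.
candidates = np.random.rand(50, 8)
best = candidates[np.argmax(surrogate.predict(candidates))]
```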
arXiv Detail & Related papers (2021-05-26T20:55:29Z) - Neural Architecture Search based on Cartesian Genetic Programming Coding Method [6.519170476143571]
We propose an evolutionary approach to NAS based on CGP, called CGPNAS, to solve sentence classification tasks.
The experimental results show that the searched architectures are comparable with the performance of human-designed architectures.
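For readers unfamiliar with Cartesian Genetic Programming, a minimal genotype sketch follows: each node gene selects an operation and wires to earlier nodes only, which keeps the encoded graph acyclic. The operation set is a placeholder rather than CGPNAS's actual grammar.

```python
import random

OPS = ["conv", "pool", "dense", "concat"]

def random_genotype(n_inputs=1, n_nodes=6):
    """One gene per node: an operation plus two earlier nodes as inputs."""
    genes = []
    for j in range(n_nodes):
        addressable = n_inputs + j  # nodes may only read earlier nodes
        genes.append({
            "op": random.choice(OPS),
            "in": [random.randrange(addressable) for _ in range(2)],
        })
    return genes

print(random_genotype())
```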
arXiv Detail & Related papers (2021-03-12T09:51:03Z) - Differentiable Neural Architecture Learning for Efficient Neural Network Design [31.23038136038325]
We introduce a novel architecture parameterisation based on a scaled sigmoid function.
We then propose a general Differentiable Neural Architecture Learning (DNAL) method to optimize the neural architecture without the need to evaluate candidate neural networks.
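A small PyTorch sketch of the scaled-sigmoid idea: as the scale factor grows, soft gates approach hard 0/1 selections while remaining differentiable, so architecture parameters train by ordinary gradient descent. The scale value, gate count, and schedule are assumptions.

```python
import torch

def scaled_sigmoid(alpha, scale):
    """Continuous relaxation of binary architecture gates; larger
    `scale` pushes outputs toward a hard 0/1 decision."""
    return torch.sigmoid(scale * alpha)

alpha = torch.zeros(4, requires_grad=True)  # one gate per candidate op
gates = scaled_sigmoid(alpha, scale=5.0)    # soft gates in (0, 1)
gates.sum().backward()                      # gradients reach alpha
```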
arXiv Detail & Related papers (2021-03-03T02:03:08Z) - Trilevel Neural Architecture Search for Efficient Single Image Super-Resolution [127.92235484598811]
This paper proposes a trilevel neural architecture search (NAS) method for efficient single image super-resolution (SR).
To model the discrete search space, we apply a new continuous relaxation that builds a hierarchical mixture of network paths, cell operations, and kernel widths.
An efficient search algorithm is proposed to perform optimization in a hierarchical supernet manner.
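One plausible reading of the trilevel relaxation, sketched with independent softmax mixtures at the path, operation, and kernel-width levels (the paper's exact coupling between levels may differ):

```python
import torch
import torch.nn.functional as F

path_logits = torch.zeros(3, requires_grad=True)      # network paths
op_logits = torch.zeros(3, 5, requires_grad=True)     # ops per path
kw_logits = torch.zeros(3, 5, 3, requires_grad=True)  # widths per op

path_w = F.softmax(path_logits, dim=-1)
op_w = F.softmax(op_logits, dim=-1)
kw_w = F.softmax(kw_logits, dim=-1)

# A candidate's effective weight is the product along its branch.
weight_000 = path_w[0] * op_w[0, 0] * kw_w[0, 0, 0]
```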
arXiv Detail & Related papers (2021-01-17T12:19:49Z) - DrNAS: Dirichlet Neural Architecture Search [88.56953713817545]
We treat the continuously relaxed architecture mixing weights as random variables modeled by a Dirichlet distribution.
With recently developed pathwise derivatives, the Dirichlet parameters can be easily optimized with gradient-based optimizers.
To alleviate the large memory consumption of differentiable NAS, we propose a simple yet effective progressive learning scheme.
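A minimal PyTorch sketch of the sampling step, using the reparameterised (pathwise) Dirichlet sample available in torch.distributions; the concentration initialisation and objective are stand-ins:

```python
import torch
from torch.distributions import Dirichlet

beta = torch.ones(5, requires_grad=True)  # Dirichlet concentration
weights = Dirichlet(beta).rsample()       # differentiable sample
loss = (weights ** 2).sum()               # stand-in objective
loss.backward()                           # gradients flow back to beta
```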
arXiv Detail & Related papers (2020-06-18T08:23:02Z) - FBNetV3: Joint Architecture-Recipe Search using Predictor Pretraining [65.39532971991778]
We present an accuracy predictor that scores architectures and training recipes jointly, guiding both sample selection and ranking.
We run fast evolutionary searches in just CPU minutes to generate architecture-recipe pairs for a variety of resource constraints.
FBNetV3 makes up a family of state-of-the-art compact neural networks that outperform both automatically and manually-designed competitors.
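A toy version of a joint architecture-recipe predictor, assuming simple fixed-length encodings for both inputs; the dimensions and layer sizes are illustrative, not FBNetV3's design.

```python
import torch
import torch.nn as nn

class JointPredictor(nn.Module):
    """Regress expected accuracy from architecture + recipe encodings."""
    def __init__(self, arch_dim=32, recipe_dim=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(arch_dim + recipe_dim, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, arch, recipe):
        return self.net(torch.cat([arch, recipe], dim=-1))

pred = JointPredictor()
scores = pred(torch.rand(4, 32), torch.rand(4, 8))  # batch of candidates
```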
arXiv Detail & Related papers (2020-06-03T05:20:21Z) - Binarizing MobileNet via Evolution-based Searching [66.94247681870125]
We propose the use of evolutionary search to facilitate the construction and training scheme when binarizing MobileNet.
Inspired by one-shot architecture search frameworks, we manipulate the idea of group convolution to design efficient 1-Bit Convolutional Neural Networks (CNNs).
Our objective is to come up with a tiny yet efficient binary neural architecture by exploring the best candidates of the group convolution.
arXiv Detail & Related papers (2020-05-13T13:25:51Z) - Sampled Training and Node Inheritance for Fast Evolutionary Neural Architecture Search [22.483917379706725]
Evolutionary neural architecture search (ENAS) has received increasing attention due to the attractive global optimization capability of evolutionary algorithms.
This paper proposes a new framework for fast ENAS based on a directed acyclic graph, in which parents are randomly sampled and trained on each mini-batch of training data.
We evaluate the proposed algorithm on widely used datasets, in comparison with 26 state-of-the-art peer algorithms.
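A simplified sketch of the sampled-training idea: randomly chosen parents each receive a single mini-batch of updates rather than full training, and offspring inherit the trained weights. The function names and inheritance mechanism are placeholders, not the paper's exact procedure.

```python
import random

def evolve_step(population, batches, train_one_batch, inherit):
    """One generation: partial training of sampled parents, then
    offspring that inherit their parents' trained node weights."""
    parents = random.sample(population, k=len(population) // 2)
    for parent, batch in zip(parents, batches):
        train_one_batch(parent, batch)         # cheap partial training
    offspring = [inherit(p) for p in parents]  # weight inheritance
    return population + offspring
```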
arXiv Detail & Related papers (2020-03-07T12:33:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.