HiveNAS: Neural Architecture Search using Artificial Bee Colony
Optimization
- URL: http://arxiv.org/abs/2211.10250v2
- Date: Thu, 15 Jun 2023 17:02:09 GMT
- Title: HiveNAS: Neural Architecture Search using Artificial Bee Colony
Optimization
- Authors: Mohamed Shahawy and Elhadj Benkhelifa
- Abstract summary: In this study, we evaluate the viability of Artificial Bee Colony optimization for Neural Architecture Search.
Our proposed framework, HiveNAS, outperforms existing state-of-the-art Swarm Intelligence-based NAS frameworks in a fraction of the time.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The traditional Neural Network development process requires substantial
expert knowledge and relies heavily on intuition and trial-and-error. Neural
Architecture Search (NAS) frameworks were introduced to robustly search for
network topologies, as well as facilitate the automated development of Neural
Networks. While some optimization approaches -- such as Genetic Algorithms --
have been extensively explored in the NAS context, other Metaheuristic
Optimization algorithms have not yet been investigated. In this study, we
evaluate the viability of Artificial Bee Colony optimization for Neural
Architecture Search. Our proposed framework, HiveNAS, outperforms existing
state-of-the-art Swarm Intelligence-based NAS frameworks in a fraction of the
time.
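To make the Artificial Bee Colony (ABC) mapping in the abstract concrete, below is a minimal, illustrative Python sketch of ABC-style architecture search. The operation set, fixed-depth encoding, and proxy fitness function are placeholder assumptions for illustration only; they are not HiveNAS's actual search space, neighbourhood operator, or training-based objective. Candidate architectures play the role of food sources: employed and onlooker bees apply local mutations, and scout bees re-initialise stagnant candidates.

```python
# Minimal sketch of Artificial Bee Colony (ABC) search over neural architectures.
# Hypothetical encoding and proxy fitness; not a reproduction of HiveNAS itself.
import random

OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]  # assumed operation set
DEPTH = 6                                            # assumed fixed depth
LIMIT = 5                                            # abandonment limit for scouts


def random_architecture():
    """A candidate 'food source': a fixed-length sequence of layer operations."""
    return [random.choice(OPS) for _ in range(DEPTH)]


def fitness(arch):
    """Proxy objective standing in for trained validation accuracy."""
    # Toy heuristic: reward convolutions, lightly credit other ops. A real NAS
    # run would train and evaluate the candidate network here.
    return sum(1.0 if op.startswith("conv") else 0.3 for op in arch) / DEPTH


def neighbour(arch):
    """Employed/onlooker move: mutate one randomly chosen layer."""
    new = list(arch)
    i = random.randrange(DEPTH)
    new[i] = random.choice([op for op in OPS if op != new[i]])
    return new


def abc_search(n_sources=10, n_iters=50, seed=0):
    random.seed(seed)
    sources = [random_architecture() for _ in range(n_sources)]
    scores = [fitness(a) for a in sources]
    trials = [0] * n_sources  # stagnation counters per food source

    def try_improve(i):
        cand = neighbour(sources[i])
        cand_score = fitness(cand)
        if cand_score > scores[i]:  # greedy acceptance
            sources[i], scores[i], trials[i] = cand, cand_score, 0
        else:
            trials[i] += 1

    for _ in range(n_iters):
        # Employed-bee phase: one local move per food source.
        for i in range(n_sources):
            try_improve(i)
        # Onlooker-bee phase: sources chosen proportionally to fitness.
        total = sum(scores)
        for _ in range(n_sources):
            i = random.choices(range(n_sources),
                               weights=[s / total for s in scores])[0]
            try_improve(i)
        # Scout-bee phase: abandon stagnant sources and re-initialise them.
        for i in range(n_sources):
            if trials[i] > LIMIT:
                sources[i] = random_architecture()
                scores[i] = fitness(sources[i])
                trials[i] = 0

    best = max(range(n_sources), key=lambda i: scores[i])
    return sources[best], scores[best]


if __name__ == "__main__":
    arch, score = abc_search()
    print("best architecture:", arch, "proxy fitness:", round(score, 3))
```

In a real search, the `fitness` call would train and validate each candidate network, which is where essentially all of the search cost lies.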
Related papers
- Colony-Enhanced Recurrent Neural Architecture Search: Collaborative Ant-Based Optimization [0.0]
This paper introduces Collaborative Ant-based Neural Topology Search (CANTS-N).
In this approach, ant-inspired agents construct neural network structures, adapting dynamically within a changing environment.
CANTS-N has the potential to reshape the landscape of Neural Architecture Search (NAS) and Neural Evolution (NE).
arXiv Detail & Related papers (2024-01-30T22:27:31Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Neural Architecture Search for Speech Emotion Recognition [72.1966266171951]
We propose to apply neural architecture search (NAS) techniques to automatically configure the SER models.
We show that NAS can improve SER performance (from 54.89% to 56.28%) while keeping model parameter sizes comparable.
arXiv Detail & Related papers (2022-03-31T10:16:10Z)
- Accelerating Neural Architecture Exploration Across Modalities Using Genetic Algorithms [5.620334754517149]
We show how genetic algorithms can be paired with lightly trained objective predictors in an iterative cycle to accelerate multi-objective architectural exploration.
NAS research efforts have centered around computer vision tasks and only recently have other modalities, such as the rapidly growing field of natural language processing, been investigated in depth.
arXiv Detail & Related papers (2022-02-25T20:01:36Z)
- NeuralArTS: Structuring Neural Architecture Search with Type Theory [0.0]
We present a new framework called Neural Architecture Type System (NeuralArTS) that categorizes the infinite set of network operations in a structured type system.
We show how NeuralArTS can be applied to convolutional layers and propose several future directions.
arXiv Detail & Related papers (2021-10-17T03:28:27Z)
- Going Beyond Neural Architecture Search with Sampling-based Neural Ensemble Search [31.059040393415003]
We present two novel sampling algorithms under our Neural Ensemble Search via Sampling (NESS) framework.
Our NESS algorithms achieve improved performance in both classification and adversarial defense tasks.
arXiv Detail & Related papers (2021-09-06T15:18:37Z)
- On the Exploitation of Neuroevolutionary Information: Analyzing the Past for a More Efficient Future [60.99717891994599]
We propose an approach that extracts information from neuroevolutionary runs and uses it to build a metamodel.
We inspect the best structures found during neuroevolutionary searches of generative adversarial networks with varying characteristics.
arXiv Detail & Related papers (2021-05-26T20:55:29Z)
- Search to aggregate neighborhood for graph neural network [47.47628113034479]
We propose a framework, Search to Aggregate NEighborhood (SANE), which automatically designs data-specific GNN architectures.
By designing a novel and expressive search space, we propose a differentiable search algorithm, which is more efficient than previous reinforcement learning based methods.
arXiv Detail & Related papers (2021-04-14T03:15:19Z)
- NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z)
- Neural Architecture Search as Sparse Supernet [78.09905626281046]
This paper extends the Neural Architecture Search (NAS) problem from Single-Path and Multi-Path Search to automated Mixed-Path Search.
We model the NAS problem as a sparse supernet using a new continuous architecture representation with a mixture of sparsity constraints.
The sparse supernet enables us to automatically achieve sparsely-mixed paths upon a compact set of nodes.
arXiv Detail & Related papers (2020-07-31T14:51:52Z)
- An Introduction to Neural Architecture Search for Convolutional Networks [0.0]
Neural Architecture Search (NAS) is a research field concerned with utilizing optimization algorithms to design optimal neural network architectures.
We provide an introduction to the basic concepts of NAS for convolutional networks, along with the major advances in search spaces, algorithms and evaluation techniques.
arXiv Detail & Related papers (2020-05-22T09:33:22Z)