Backpropagation-Free 4D Continuous Ant-Based Neural Topology Search
- URL: http://arxiv.org/abs/2305.06715v3
- Date: Tue, 30 Jan 2024 21:53:35 GMT
- Title: Backpropagation-Free 4D Continuous Ant-Based Neural Topology Search
- Authors: AbdElRahman ElSaid and Karl Ricanek and Zeming Lyu and Alexander
Ororbia and Travis Desell
- Abstract summary: This work expands CANTS by adding a fourth dimension to its search space representing potential neural synaptic weights.
The experiments of this study demonstrate that the BP-Free CANTS algorithm exhibits highly competitive performance compared to both CANTS and ANTS.
- Score: 51.18089545051242
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Continuous Ant-based Topology Search (CANTS) is a previously
introduced nature-inspired neural architecture search (NAS) algorithm based on
ant colony optimization (ACO). CANTS utilizes a continuous search space to
indirectly encode a neural architecture search space. Synthetic ant agents
explore CANTS' continuous search space based on the density and distribution of
pheromones, strongly inspired by how ants move in the real world. This
continuous search space allows CANTS to automate the design of artificial
neural networks (ANNs) of any size, removing a key limitation inherent to many
current NAS algorithms that must operate within structures of a size that is
predetermined by the user. This work expands CANTS by adding a fourth dimension
to its search space representing potential neural synaptic weights. Adding this
extra dimension allows CANTS agents to optimize both the architecture and the
weights of an ANN without applying backpropagation (BP), which significantly
reduces optimization time: on average, at least 96% less time is consumed, with
very competitive, and in some cases better, optimization performance. The
experiments of this study, using real-world data, demonstrate that the BP-Free
CANTS algorithm exhibits highly competitive performance compared to both CANTS
and ANTS while requiring significantly less operation time.
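To make the mechanism concrete, below is a minimal Python sketch of a BP-free
4D continuous ant-based search under assumed simplifications: pheromone
deposits and agent positions are 4D points (x, y, depth, weight), agents take
short pheromone-biased random walks, sampled paths are scored by direct
forward evaluation, and fitter paths deposit stronger pheromone. All names,
parameters, and the toy fitness function are illustrative assumptions, not the
authors' implementation.

```python
import math
import random

# Sketch of a BP-free 4D continuous ant-based search.
# Each pheromone deposit and each agent step is a 4D point:
# (x, y) position in the structure plane, z = depth, w = synaptic weight.

def pheromone_bias(point, deposits, sigma=0.5):
    """Attractiveness of a candidate point: summed Gaussian influence of deposits."""
    return sum(
        d["strength"]
        * math.exp(-sum((p - q) ** 2 for p, q in zip(point, d["pos"])) / (2 * sigma**2))
        for d in deposits
    )

def agent_walk(deposits, steps=8, n_candidates=10):
    """One synthetic ant: a short walk through the 4D space, biased by pheromones."""
    path = []
    pos = tuple(random.uniform(0.0, 1.0) for _ in range(4))
    for _ in range(steps):
        candidates = [
            tuple(c + random.gauss(0.0, 0.1) for c in pos) for _ in range(n_candidates)
        ]
        weights = [pheromone_bias(c, deposits) + 1e-6 for c in candidates]
        pos = random.choices(candidates, weights=weights, k=1)[0]
        path.append(pos)
    return path

def evaluate(path, data):
    """Score a candidate by direct forward evaluation: no gradients, no backprop.
    Toy stand-in: treat each point's w component as a weight in a linear model."""
    ws = [p[3] for p in path]
    error = 0.0
    for x, y in data:
        pred = sum(w * x for w in ws) / len(ws)
        error += (pred - y) ** 2
    return error / len(data)

def search(data, n_agents=20, iterations=50, evaporation=0.9):
    deposits = []
    best_path, best_err = None, float("inf")
    for _ in range(iterations):
        for _ in range(n_agents):
            path = agent_walk(deposits)
            err = evaluate(path, data)
            if err < best_err:
                best_path, best_err = path, err
            # Deposit pheromone along the path, stronger for fitter solutions.
            for pos in path:
                deposits.append({"pos": pos, "strength": 1.0 / (1.0 + err)})
        # Evaporation keeps the pheromone field from saturating.
        for d in deposits:
            d["strength"] *= evaporation
        deposits = [d for d in deposits if d["strength"] > 1e-3]
    return best_path, best_err
```

On a toy regression task such as data = [(0.1 * i, 0.2 * i) for i in
range(10)], search(data) drifts toward paths whose weight components average
near the target coefficient, illustrating how the fourth (weight) dimension
lets agents tune synaptic weights purely through pheromone feedback.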
Related papers
- Parallel Hyperparameter Optimization Of Spiking Neural Network [0.5371337604556311]
Spiking Neural Networks (SNNs) are based on a more biologically inspired approach than usual artificial neural networks.
We tackle the signal loss issue of SNNs, which leads to what we call silent networks.
By defining an early stopping criterion, we were able to instantiate larger and more flexible search spaces.
arXiv Detail & Related papers (2024-03-01T11:11:59Z)
- Colony-Enhanced Recurrent Neural Architecture Search: Collaborative
Ant-Based Optimization [0.0]
This paper introduces Collaborative Ant-based Neural Topology Search (CANTS-N).
In this approach, ant-inspired agents construct neural network structures, adapting dynamically within a changing environment.
CANTS-N has the potential to reshape the landscape of Neural Architecture Search (NAS) and Neural Evolution (NE).
arXiv Detail & Related papers (2024-01-30T22:27:31Z)
- LitE-SNN: Designing Lightweight and Efficient Spiking Neural Network through Spatial-Temporal Compressive Network Search and Joint Optimization [48.41286573672824]
Spiking Neural Networks (SNNs) mimic the information-processing mechanisms of the human brain and are highly energy-efficient.
We propose a new approach named LitE-SNN that incorporates both spatial and temporal compression into the automated network design process.
arXiv Detail & Related papers (2024-01-26T05:23:11Z)
- Poisoning the Search Space in Neural Architecture Search [0.0]
We evaluate the robustness of one such algorithm known as Efficient NAS against data poisoning attacks on the original search space.
Our results provide insights into the challenges to surmount in using NAS for more adversarially robust architecture search.
arXiv Detail & Related papers (2021-06-28T05:45:57Z)
- BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search [100.28980854978768]
We present Block-wisely Self-supervised Neural Architecture Search (BossNAS).
We factorize the search space into blocks and utilize a novel self-supervised training scheme, named ensemble bootstrapping, to train each block separately.
We also present HyTra search space, a fabric-like hybrid CNN-transformer search space with searchable down-sampling positions.
arXiv Detail & Related papers (2021-03-23T10:05:58Z)
- AutoSpace: Neural Architecture Search with Less Human Interference [84.42680793945007]
Current neural architecture search (NAS) algorithms still require expert knowledge and effort to design a search space for network construction.
We propose a novel differentiable evolutionary framework named AutoSpace, which evolves the search space to an optimal one.
With the learned search space, the performance of recent NAS algorithms can be improved significantly compared with using manually designed spaces.
arXiv Detail & Related papers (2021-03-22T13:28:56Z)
- Trilevel Neural Architecture Search for Efficient Single Image Super-Resolution [127.92235484598811]
This paper proposes a trilevel neural architecture search (NAS) method for efficient single image super-resolution (SR).
To model the discrete search space, we apply a new continuous relaxation that builds a hierarchical mixture of network paths, cell operations, and kernel widths.
An efficient search algorithm is proposed to perform optimization in a hierarchical supernet manner.
arXiv Detail & Related papers (2021-01-17T12:19:49Z)
- Continuous Ant-Based Neural Topology Search [62.200941836913586]
This work introduces a novel, nature-inspired neural architecture search (NAS) algorithm based on ant colony optimization.
The Continuous Ant-based Neural Topology Search (CANTS) is strongly inspired by how ants move in the real world.
arXiv Detail & Related papers (2020-11-21T17:49:44Z)