Continuous Ant-Based Neural Topology Search
- URL: http://arxiv.org/abs/2011.10831v1
- Date: Sat, 21 Nov 2020 17:49:44 GMT
- Title: Continuous Ant-Based Neural Topology Search
- Authors: AbdElRahman ElSaid, Joshua Karns, Zimeng Lyu, Alexander Ororbia,
Travis Desell
- Abstract summary: This work introduces a novel, nature-inspired neural architecture search (NAS) algorithm based on ant colony optimization.
The Continuous Ant-based Neural Topology Search (CANTS) is strongly inspired by how ants move in the real world.
- Score: 62.200941836913586
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work introduces a novel, nature-inspired neural architecture
search (NAS) algorithm based on ant colony optimization: Continuous Ant-based
Neural Topology Search (CANTS). Strongly inspired by how ants move in the real
world, CANTS utilizes synthetic ants that move over a continuous search space
based on the density and distribution of pheromones. The paths taken by the
ant agents through the search space are used to construct artificial neural
networks (ANNs). This continuous search space allows CANTS to automate the
design of ANNs of any size, removing a key limitation of many current NAS
algorithms, which must operate within structures whose size is predetermined
by the user. CANTS employs a distributed asynchronous strategy that allows it
to scale across large high-performance computing resources, works with a
variety of recurrent memory cell structures, and uses a communal weight-sharing
strategy to reduce training time. The proposed procedure is evaluated on three
real-world time-series prediction problems in the field of power systems and
compared against two state-of-the-art algorithms. Results show that CANTS
provides improved or competitive results on all of these problems while also
being easier to use, requiring half as many user-specified hyper-parameters.
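To make the search procedure concrete, below is a minimal Python sketch of the core loop: synthetic ants drift through a continuous space, pulled toward nearby pheromone deposits, and their paths are snapped to a grid to form a candidate network graph. Everything here (the Pheromone class, the step_ant movement rule, the grid discretization, and all parameter values) is an illustrative assumption rather than the authors' implementation; the real CANTS additionally handles recurrent memory cells, asynchronous distributed workers, and communal weight sharing.

```python
import math
import random

class Pheromone:
    """A pheromone deposit: a weighted point in the continuous 2-D search
    space (x = position within a layer, y = depth). Illustrative only."""
    def __init__(self, x, y, strength=1.0):
        self.x, self.y, self.strength = x, y, strength

def step_ant(pos, pheromones, sense_radius=0.2, explore_prob=0.3):
    """Move an ant one step: mostly follow nearby pheromone density,
    occasionally explore at random (hypothetical movement rule)."""
    x, y = pos
    nearby = [p for p in pheromones if math.hypot(p.x - x, p.y - y) < sense_radius]
    if not nearby or random.random() < explore_prob:
        # Exploratory step; the ant always moves deeper (y increases).
        nx = min(max(x + random.uniform(-0.1, 0.1), 0.0), 1.0)
        return (nx, y + random.uniform(0.05, 0.15))
    # Pull toward the strength-weighted center of nearby deposits.
    total = sum(p.strength for p in nearby)
    tx = sum(p.x * p.strength for p in nearby) / total
    return (x + 0.5 * (tx - x), y + 0.1)

def ant_path(pheromones, max_depth=1.0):
    """One ant's walk from the input layer (y=0) down to the output layer."""
    pos = (random.random(), 0.0)
    path = [pos]
    while pos[1] < max_depth:
        pos = step_ant(pos, pheromones)
        path.append(pos)
    return path

def paths_to_graph(paths, grid=0.1):
    """Merge ant paths into one ANN: points that snap to the same grid cell
    become a shared node, consecutive points along a path become edges."""
    nodes, edges = set(), set()
    for path in paths:
        snapped = [(round(x / grid), round(y / grid)) for x, y in path]
        nodes.update(snapped)
        edges.update((a, b) for a, b in zip(snapped, snapped[1:]) if a != b)
    return nodes, edges

pheromones = [Pheromone(random.random(), random.random()) for _ in range(50)]
paths = [ant_path(pheromones) for _ in range(10)]
nodes, edges = paths_to_graph(paths)
print(f"candidate ANN: {len(nodes)} nodes, {len(edges)} edges")
```

In a full colony-optimization loop, the best-evaluated networks would deposit fresh pheromone along their paths, biasing the next generation of ants toward promising regions of the space.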
Related papers
- Colony-Enhanced Recurrent Neural Architecture Search: Collaborative Ant-Based Optimization [0.0]
This paper introduces Collaborative Ant-based Neural Topology Search (CANTS-N).
In this approach, ant-inspired agents construct neural network structures, adapting dynamically within a changing environment.
CANTS-N has the potential to reshape the landscape of Neural Architecture Search (NAS) and Neural Evolution (NE).
arXiv Detail & Related papers (2024-01-30T22:27:31Z)
- Backpropagation-Free 4D Continuous Ant-Based Neural Topology Search [51.18089545051242]
This work expands CANTS by adding a fourth dimension to its search space representing potential neural synaptic weights.
The experiments of this study demonstrate that the BP-Free CANTS algorithm exhibits highly competitive performance compared to both CANTS and ANTS.
arXiv Detail & Related papers (2023-05-11T10:49:07Z)
- $\beta$-DARTS: Beta-Decay Regularization for Differentiable Architecture Search [85.84110365657455]
We propose a simple but efficient regularization method, termed Beta-Decay, to regularize the DARTS-based NAS search process.
Experimental results on NAS-Bench-201 show that the proposed method helps stabilize the search process and makes the searched networks more transferable across different datasets.
arXiv Detail & Related papers (2022-03-03T11:47:14Z)
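As a rough sketch of the Beta-Decay idea above, a decay-style penalty can be added to the loss that updates the architecture parameters. The logsumexp form, the variable names, and the coefficient below are assumptions made for illustration, not the paper's exact formulation:

```python
import torch

def beta_decay_penalty(arch_params: torch.Tensor) -> torch.Tensor:
    """Hypothetical Beta-Decay-style regularizer: a logsumexp over the raw
    architecture parameters. Shrinking it keeps the softmax-normalized
    weights (beta) smooth, discouraging premature operation dominance."""
    return torch.logsumexp(arch_params, dim=-1).mean()

# Usage sketch for a DARTS-like architecture update (all names assumed):
alpha = torch.randn(14, 8, requires_grad=True)  # 14 edges x 8 candidate ops
val_loss = torch.tensor(0.0)                    # stand-in for the validation loss
lam = 0.5                                       # regularization strength
(val_loss + lam * beta_decay_penalty(alpha)).backward()
print(alpha.grad.abs().mean())                  # the penalty alone yields gradients
```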
- BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search [100.28980854978768]
We present Block-wisely Self-supervised Neural Architecture Search (BossNAS).
We factorize the search space into blocks and utilize a novel self-supervised training scheme, named ensemble bootstrapping, to train each block separately.
We also present the HyTra search space, a fabric-like hybrid CNN-transformer search space with searchable down-sampling positions.
arXiv Detail & Related papers (2021-03-23T10:05:58Z)
- GNAS: A Generalized Neural Network Architecture Search Framework [0.0]
In practice, the problems encountered in Neural Architecture Search (NAS) training are not isolated; rather, a combination of difficulties is often faced.
This paper draws on and improves previous research that solves only single NAS problems, combining the approaches into a practical technology flow.
arXiv Detail & Related papers (2021-03-19T06:51:22Z)
- Trilevel Neural Architecture Search for Efficient Single Image Super-Resolution [127.92235484598811]
This paper proposes a trilevel neural architecture search (NAS) method for efficient single image super-resolution (SR).
To model the discrete search space, we apply a new continuous relaxation that builds a hierarchical mixture of network paths, cell operations, and kernel widths.
An efficient search algorithm is proposed to perform optimization over this hierarchical supernet.
arXiv Detail & Related papers (2021-01-17T12:19:49Z)
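The continuous relaxation mentioned in the Trilevel NAS summary above is, at its core, the standard differentiable-NAS construction: each discrete choice is replaced by a softmax-weighted mixture over all candidates, making the choice learnable by gradient descent. A minimal single-level sketch in PyTorch (the paper's three levels of network path, cell operation, and kernel width are collapsed into one mixed operation here; all names are illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Softmax-weighted mixture over candidate operations: the continuous
    relaxation of a single discrete architectural choice."""
    def __init__(self, channels: int):
        super().__init__()
        # Candidate ops for one edge; kernel width is one searchable axis.
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.Conv2d(channels, channels, kernel_size=5, padding=2),
            nn.Identity(),
        ])
        # One architecture parameter per candidate, learned jointly.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

op = MixedOp(channels=16)
out = op(torch.randn(2, 16, 32, 32))  # differentiable w.r.t. op.alpha
print(out.shape)                      # torch.Size([2, 16, 32, 32])
```

After search, the highest-weighted candidate at each choice point is kept and the mixture is discretized back into a single operation.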
- MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach to reduce the search cost.
We achieve state-of-the-art results in terms of the accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z)
- SpaceNet: Make Free Space For Continual Learning [15.914199054779438]
We propose a novel architecture-based method, referred to as SpaceNet, for the class-incremental learning scenario.
SpaceNet trains sparse deep neural networks from scratch in an adaptive way, compressing the sparse connections of each task into a compact number of neurons.
Experimental results show the robustness of the proposed method against catastrophic forgetting of old tasks and the efficiency of SpaceNet in utilizing the model's available capacity.
arXiv Detail & Related papers (2020-07-15T11:21:31Z)
- VINNAS: Variational Inference-based Neural Network Architecture Search [2.685668802278155]
We present a differentiable variational inference-based NAS method for searching sparse convolutional neural networks.
Our method finds diverse network cells while achieving state-of-the-art accuracy with up to nearly half the non-zero parameters.
arXiv Detail & Related papers (2020-07-12T21:47:35Z)
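For the variational-inference entry above, the generic recipe is to treat network or architecture parameters as random variables with a learned Gaussian posterior, trained via the reparameterization trick plus a KL term that encourages sparsity. A schematic sketch under those assumptions, not the paper's actual model (the unit-Gaussian prior, log_sigma parameterization, and KL weight are illustrative):

```python
import torch
import torch.nn as nn

class VariationalParam(nn.Module):
    """A tensor with a factorized Gaussian posterior q(w) = N(mu, sigma^2),
    sampled with the reparameterization trick so gradients flow through."""
    def __init__(self, shape):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(shape))
        self.log_sigma = nn.Parameter(torch.full(shape, -3.0))

    def sample(self) -> torch.Tensor:
        eps = torch.randn_like(self.mu)
        return self.mu + torch.exp(self.log_sigma) * eps

    def kl_to_standard_normal(self) -> torch.Tensor:
        # KL( N(mu, sigma^2) || N(0, 1) ), summed over entries; minimizing it
        # pulls many mu toward zero, and near-zero entries can be pruned away.
        var = torch.exp(2 * self.log_sigma)
        return 0.5 * (var + self.mu ** 2 - 1 - 2 * self.log_sigma).sum()

w = VariationalParam((8, 3))  # e.g. per-edge operation logits (hypothetical)
sample = w.sample()           # stochastic forward pass
loss = sample.pow(2).mean() + 1e-3 * w.kl_to_standard_normal()
loss.backward()               # gradients reach both mu and log_sigma
```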
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.