A Self-adaptive Neuroevolution Approach to Constructing Deep Neural
Network Architectures Across Different Types
- URL: http://arxiv.org/abs/2211.14753v2
- Date: Wed, 30 Nov 2022 12:47:35 GMT
- Authors: Zhenhao Shuai, Hongbo Liu, Zhaolin Wan, Wei-Jie Yu, Jun Zhang
- Abstract summary: We propose a self-adaptive neuroevolution (SANE) approach to automatically construct various lightweight Deep Neural Network (DNN) architectures for different tasks.
One of the key settings in SANE is the search space defined by cells and organs self-adapted to different DNN types.
SANE is able to self-adaptively adjust evolution exploration and exploitation to improve search efficiency.
- Score: 5.429458930060452
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neuroevolution has greatly promoted Deep Neural Network (DNN) architecture
design and its applications, yet methods that work across different DNN types
while addressing both scale and performance remain lacking. In this study,
we propose a self-adaptive neuroevolution (SANE) approach to automatically
construct various lightweight DNN architectures for different tasks. One of the
key settings in SANE is the search space defined by cells and organs
self-adapted to different DNN types. Based on this search space, a constructive
evolution strategy with uniform evolution settings and operations is designed
to grow DNN architectures gradually. SANE is able to self-adaptively adjust
evolution exploration and exploitation to improve search efficiency. Moreover,
a speciation scheme is developed to protect evolution from early convergence by
restricting selection competition within species. To evaluate SANE, we carry
out neuroevolution experiments to generate different DNN architectures
including convolutional neural network, generative adversarial network and long
short-term memory. The results illustrate that the obtained DNN architectures
are smaller in scale while achieving performance similar to existing DNN
architectures. Our proposed SANE provides an efficient approach to
self-adaptively searching for DNN architectures across different types.
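The abstract describes a constructive evolution strategy that grows architectures gradually, plus a speciation scheme that restricts selection competition to within species. A minimal toy sketch of such a loop is below; the genome encoding, fitness proxy, mutation operators, and compatibility threshold are all illustrative assumptions, not SANE's actual settings (SANE evaluates real trained DNNs):

```python
import random

random.seed(0)


def fitness(genome):
    # Illustrative proxy only: reward capacity with diminishing returns and
    # penalize scale, mirroring the goal of lightweight architectures.
    capacity = sum(min(c, 64) for c in genome)
    scale_penalty = 0.01 * sum(genome)
    return capacity ** 0.5 - scale_penalty


def mutate(genome):
    # Constructive evolution: architectures grow gradually from small seeds.
    g = list(genome)
    if random.random() < 0.5:
        g.append(random.choice([8, 16, 32]))          # add a new cell
    else:
        i = random.randrange(len(g))
        g[i] = max(1, g[i] + random.choice([-8, 8]))  # resize an existing cell
    return g


def distance(a, b):
    # Simple structural distance between genomes, used for speciation.
    excess = abs(len(a) - len(b))
    diff = sum(abs(x - y) for x, y in zip(a, b)) / max(min(len(a), len(b)), 1)
    return excess + 0.1 * diff


def speciate(population, threshold=2.0):
    # Group genomes into species; selection competition stays within each
    # species, protecting novel structures from early convergence.
    species = []
    for genome in population:
        for members in species:
            if distance(genome, members[0]) < threshold:
                members.append(genome)
                break
        else:
            species.append([genome])
    return species


def evolve(generations=20, pop_size=20):
    # Start from minimal architectures and grow them constructively.
    population = [[16] for _ in range(pop_size)]
    for _ in range(generations):
        next_pop = []
        for members in speciate(population):
            members.sort(key=fitness, reverse=True)
            survivors = members[: max(1, len(members) // 2)]
            next_pop.append(survivors[0])  # elitism within the species
            for _ in range(len(members) - 1):
                next_pop.append(mutate(random.choice(survivors)))
        population = next_pop
    return max(population, key=fitness)


best = evolve()
print("best genome:", best, "fitness:", round(fitness(best), 3))
```

Because selection is restricted to within species, a genome with a new structural feature only competes against similar genomes, giving it time to mature before facing the whole population.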
Related papers
- Spatial-Temporal Search for Spiking Neural Networks [32.937536365872745]
Spiking Neural Networks (SNNs) are considered as a potential candidate for the next generation of artificial intelligence.
We propose a differentiable approach to optimize SNN on both spatial and temporal dimensions.
Our methods achieve comparable classification performance on CIFAR10/100 and ImageNet, with accuracies of 96.43%, 78.96%, and 70.21%, respectively.
arXiv Detail & Related papers (2024-10-24T09:32:51Z)
- AD-NEv++: The multi-architecture neuroevolution-based multivariate anomaly detection framework [0.794682109939797]
Anomaly detection tools and methods enable key analytical capabilities in modern cyberphysical and sensor-based systems.
We propose AD-NEv++, a three-stage neuroevolution-based method that synergistically combines subspace evolution, model evolution, and fine-tuning.
We show that AD-NEv++ outperforms state-of-the-art Graph Neural Network (GNN) model architectures on all anomaly detection benchmarks.
arXiv Detail & Related papers (2024-03-25T08:40:58Z)
- Unveiling the Unseen: Identifiable Clusters in Trained Depthwise Convolutional Kernels [56.69755544814834]
Recent advances in depthwise-separable convolutional neural networks (DS-CNNs) have led to novel architectures.
This paper reveals another striking property of DS-CNN architectures: discernible and explainable patterns emerge in their trained depthwise convolutional kernels in all layers.
arXiv Detail & Related papers (2024-01-25T19:05:53Z)
- Brain-inspired Evolutionary Architectures for Spiking Neural Networks [6.607406750195899]
We explore efficient architectural optimization for Spiking Neural Networks (SNNs).
This paper evolves SNN architectures by incorporating brain-inspired local modular structure and global cross-module connectivity.
We introduce an efficient multi-objective evolutionary algorithm based on a few-shot performance predictor, endowing SNNs with high performance, efficiency and low energy consumption.
arXiv Detail & Related papers (2023-09-11T06:39:11Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Multi-scale Evolutionary Neural Architecture Search for Deep Spiking Neural Networks [7.271032282434803]
We propose a Multi-Scale Evolutionary Neural Architecture Search (MSE-NAS) for Spiking Neural Networks (SNNs).
MSE-NAS evolves individual neuron operations, self-organized integration of multiple circuit motifs, and global connectivity across motifs through a brain-inspired indirect evaluation function based on Representational Dissimilarity Matrices (RDMs).
The proposed algorithm achieves state-of-the-art (SOTA) performance with shorter simulation steps on static datasets and neuromorphic datasets.
arXiv Detail & Related papers (2023-04-21T05:36:37Z)
- On the Intrinsic Structures of Spiking Neural Networks [66.57589494713515]
Recent years have seen a surge of interest in SNNs owing to their remarkable potential to handle time-dependent and event-driven data.
There has been a dearth of comprehensive studies examining the impact of intrinsic structures within spiking computations.
This work delves deep into the intrinsic structures of SNNs, by elucidating their influence on the expressivity of SNNs.
arXiv Detail & Related papers (2022-06-21T09:42:30Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Evolutionary Architecture Search for Graph Neural Networks [23.691915813153496]
We propose a novel AutoML framework through the evolution of individual models in a large Graph Neural Networks (GNN) architecture space.
To the best of our knowledge, this is the first work to introduce and evaluate evolutionary architecture search for GNN models.
arXiv Detail & Related papers (2020-09-21T22:11:53Z)
- Neural Architecture Search For LF-MMI Trained Time Delay Neural Networks [61.76338096980383]
A range of neural architecture search (NAS) techniques are used to automatically learn two types of hyperparameters of state-of-the-art factored time delay neural networks (TDNNs).
These include the DARTS method, which integrates architecture selection with lattice-free MMI (LF-MMI) TDNN training.
Experiments conducted on a 300-hour Switchboard corpus suggest the auto-configured systems consistently outperform the baseline LF-MMI TDNN systems.
arXiv Detail & Related papers (2020-07-17T08:32:11Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.