Multi-scale Evolutionary Neural Architecture Search for Deep Spiking
Neural Networks
- URL: http://arxiv.org/abs/2304.10749v5
- Date: Tue, 7 Nov 2023 07:54:54 GMT
- Title: Multi-scale Evolutionary Neural Architecture Search for Deep Spiking
Neural Networks
- Authors: Wenxuan Pan, Feifei Zhao, Guobin Shen, Yi Zeng
- Abstract summary: We propose a Multi-Scale Evolutionary Neural Architecture Search (MSE-NAS) for Spiking Neural Networks (SNNs).
MSE-NAS evolves individual neuron operations, self-organized integration of multiple circuit motifs, and global connectivity across motifs through a brain-inspired indirect evaluation function, Representational Dissimilarity Matrices (RDMs).
The proposed algorithm achieves state-of-the-art (SOTA) performance with fewer simulation steps on static and neuromorphic datasets.
- Score: 7.271032282434803
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Spiking Neural Networks (SNNs) have received considerable attention not only
for their superiority in energy efficiency with discrete signal processing but
also for their natural suitability to integrate multi-scale biological
plasticity. However, most SNNs directly adopt the structure of well-established
Deep Neural Networks (DNNs), and Neural Architecture Search (NAS) is rarely
applied to automatically design architectures for SNNs. The neural motif topology, modular
regional structure and global cross-brain region connection of the human brain
are the product of natural evolution and can serve as a perfect reference for
designing brain-inspired SNN architecture. In this paper, we propose a
Multi-Scale Evolutionary Neural Architecture Search (MSE-NAS) for SNNs,
simultaneously considering micro-, meso- and macro-scale brain topologies as
the evolutionary search space. MSE-NAS evolves individual neuron operations,
self-organized integration of multiple circuit motifs, and global connectivity
across motifs through a brain-inspired indirect evaluation function,
Representational Dissimilarity Matrices (RDMs). This training-free fitness
function greatly reduces computational cost and the time required for NAS, and its
task-independent property enables the searched SNNs to exhibit excellent
transferability across multiple datasets. Furthermore, MSE-NAS shows robustness
against the training method and noise. Extensive experiments demonstrate that
the proposed algorithm achieves state-of-the-art (SOTA) performance with
fewer simulation steps on static datasets (CIFAR10, CIFAR100) and
neuromorphic datasets (CIFAR10-DVS and DVS128-Gesture). The thorough analysis
also illustrates the significant performance improvement and consistent
bio-interpretability derived from the topological evolution at different
scales and from the RDM fitness function.
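To make the RDM-based fitness concrete, the following is a minimal Python sketch of the standard RDM recipe: the dissimilarity between a network's responses to two probe stimuli is one minus their Pearson correlation, and a candidate is scored by how well its RDM correlates with a reference RDM. This is an illustration under those assumptions, not the authors' implementation; all names are hypothetical.

```python
import numpy as np

def compute_rdm(features: np.ndarray) -> np.ndarray:
    """Representational Dissimilarity Matrix: 1 - Pearson correlation
    between the responses to every pair of probe stimuli.

    features: (n_stimuli, n_units) activations of a candidate SNN
    recorded on a fixed probe batch (a single forward pass).
    """
    return 1.0 - np.corrcoef(features)  # (n_stimuli, n_stimuli)

def rdm_fitness(candidate_features: np.ndarray,
                reference_rdm: np.ndarray) -> float:
    """Training-free fitness: correlation between the candidate's RDM
    and a reference RDM, compared over the upper triangle only
    (both matrices are symmetric with zero diagonals)."""
    rdm = compute_rdm(candidate_features)
    iu = np.triu_indices_from(rdm, k=1)
    return float(np.corrcoef(rdm[iu], reference_rdm[iu])[0, 1])
```

Because the score needs only forward passes on a small probe batch, every candidate architecture can be ranked without any training, which is what makes the fitness cheap to evaluate and independent of the downstream task.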
Related papers
- Spatial-Temporal Search for Spiking Neural Networks [32.937536365872745]
Spiking Neural Networks (SNNs) are considered a promising candidate for the next generation of artificial intelligence.
We propose a differentiable approach to optimize SNNs on both the spatial and temporal dimensions (a generic sketch of the relaxation appears below).
Our methods achieve comparable classification performance on CIFAR10/100 and ImageNet, with accuracies of 96.43%, 78.96%, and 70.21%, respectively.
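The "differentiable approach" here presumably follows a DARTS-style relaxation, in which every candidate operation stays active and is weighted by a softmax over learnable architecture parameters, so the architecture can be optimized by gradient descent alongside the weights. A generic sketch under that assumption (not code from the paper):

```python
import torch
import torch.nn as nn

class MixedOp(nn.Module):
    """Softmax-weighted mixture of candidate operations (DARTS-style).

    ops: a list of nn.Module candidates (e.g., different spiking
    layers); `alpha` holds the learnable architecture parameters.
    """
    def __init__(self, ops):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        self.alpha = nn.Parameter(torch.zeros(len(ops)))

    def forward(self, x):
        w = torch.softmax(self.alpha, dim=0)
        # All candidates run; gradients flow into both weights and alpha.
        return sum(wi * op(x) for wi, op in zip(w, self.ops))
```

After search, the operation with the largest `alpha` is typically kept and the rest discarded.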
arXiv Detail & Related papers (2024-10-24T09:32:51Z)
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN), we reduce the time and space complexities from cubic and quadratic in the sequence length, respectively, to linear.
Extensive experiments demonstrate that the scalable variant (S-MNN) matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z)
- Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models with task-driven SNNs, balancing bio-inspiration and complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z)
- Direct Training High-Performance Deep Spiking Neural Networks: A Review of Theories and Methods [33.377770671553336]
Spiking neural networks (SNNs) offer a promising energy-efficient alternative to artificial neural networks (ANNs).
In this paper, we provide a new perspective to summarize the theories and methods for training deep SNNs with high performance.
arXiv Detail & Related papers (2024-05-06T09:58:54Z)
- Brain-inspired Evolutionary Architectures for Spiking Neural Networks [6.607406750195899]
We explore efficient architectural optimization for Spiking Neural Networks (SNNs).
This paper evolves SNN architectures by incorporating brain-inspired local modular structure and global cross-module connectivity.
We introduce an efficient multi-objective evolutionary algorithm based on a few-shot performance predictor, endowing SNNs with high performance, efficiency and low energy consumption.
arXiv Detail & Related papers (2023-09-11T06:39:11Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded in homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- A Synapse-Threshold Synergistic Learning Approach for Spiking Neural Networks [1.8556712517882232]
Spiking neural networks (SNNs) have demonstrated excellent capabilities in various intelligent scenarios.
In this study, we develop a novel synergistic learning approach that simultaneously trains synaptic weights and spike thresholds in SNNs (a rough sketch of the idea follows).
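As an illustration of joint weight-threshold learning, here is a sketch using a standard LIF neuron trained with a surrogate gradient; the firing thresholds are ordinary trainable parameters, so one backward pass updates weights and thresholds together. The paper's actual neuron model and surrogate may differ.

```python
import torch
import torch.nn as nn

class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient."""
    @staticmethod
    def forward(ctx, v, theta):
        ctx.save_for_backward(v, theta)
        return (v >= theta).float()

    @staticmethod
    def backward(ctx, grad_out):
        v, theta = ctx.saved_tensors
        box = ((v - theta).abs() < 0.5).float()   # surrogate window
        return grad_out * box, -(grad_out * box).sum(0)

class LIFLayer(nn.Module):
    """LIF layer whose firing threshold is trained jointly with
    the synaptic weights."""
    def __init__(self, n_in, n_out, tau=2.0):
        super().__init__()
        self.fc = nn.Linear(n_in, n_out)
        self.theta = nn.Parameter(torch.ones(n_out))  # learnable threshold
        self.tau = tau

    def forward(self, x, v):
        v = v + (self.fc(x) - v) / self.tau  # leaky integration
        s = SpikeFn.apply(v, self.theta)     # spike where v >= theta
        v = v * (1.0 - s)                    # hard reset after a spike
        return s, v
```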
arXiv Detail & Related papers (2022-06-10T06:41:36Z)
- Deep Reinforcement Learning Guided Graph Neural Networks for Brain Network Analysis [61.53545734991802]
We propose a novel brain network representation framework, namely BN-GNN, which searches for the optimal GNN architecture for each brain network.
Our proposed BN-GNN improves the performance of traditional GNNs on different brain network analysis tasks.
arXiv Detail & Related papers (2022-03-18T07:05:27Z)
- Finite Meta-Dynamic Neurons in Spiking Neural Networks for Spatio-temporal Learning [13.037452551907657]
Spiking Neural Networks (SNNs) have incorporated more biologically-plausible structures and learning principles.
We propose Meta-Dynamic Neurons (MDNs) to improve the generalization of SNNs during spatio-temporal learning.
The MDNs are first generated from a spatial dataset (MNIST) and a temporal dataset (TIDigits), and then extended to various other spatio-temporal tasks.
arXiv Detail & Related papers (2020-10-07T03:49:28Z)
- Exploiting Heterogeneity in Operational Neural Networks by Synaptic Plasticity [87.32169414230822]
The recently proposed Operational Neural Networks (ONNs) generalize conventional Convolutional Neural Networks (CNNs).
In this study, the focus is on searching for the best possible operator set(s) for the hidden neurons of the network, based on the Synaptic Plasticity paradigm, which constitutes the essential learning theory in biological neurons.
Experimental results over highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, can achieve superior learning performance compared to GIS-based ONNs.
arXiv Detail & Related papers (2020-08-21T19:03:23Z)
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of only a small subset of neurons (a toy sketch of the dual objective follows).
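A toy version of this dual objective, with all names hypothetical: the task term matches the RNN's outputs to the teacher's, while a second term matches the hidden states of a small observed subset of units.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

rnn = nn.RNN(input_size=8, hidden_size=64, batch_first=True)
readout = nn.Linear(64, 2)
observed = torch.randperm(64)[:10]  # supervise only 10 of 64 units

def loss_fn(x, teacher_y, teacher_h, lam=1.0):
    """x: (batch, time, 8); teacher_y: (batch, time, 2);
    teacher_h: (batch, time, 10) recorded 'neural' activities."""
    h, _ = rnn(x)                        # (batch, time, 64)
    task = F.mse_loss(readout(h), teacher_y)
    # Extra term: reproduce internal dynamics on the observed subset.
    dyn = F.mse_loss(h[..., observed], teacher_h)
    return task + lam * dyn
```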
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.