Evolving Efficient Genetic Encoding for Deep Spiking Neural Networks
- URL: http://arxiv.org/abs/2411.06792v1
- Date: Mon, 11 Nov 2024 08:40:52 GMT
- Title: Evolving Efficient Genetic Encoding for Deep Spiking Neural Networks
- Authors: Wenxuan Pan, Feifei Zhao, Bing Han, Haibo Tong, Yi Zeng
- Abstract summary: Spiking Neural Networks (SNNs) offer a low-energy alternative to Artificial Neural Networks (ANNs).
Existing SNN models still face high computational costs due to the numerous time steps as well as network depth and scale.
We propose an efficient genetic encoding strategy that dynamically evolves to regulate large-scale deep SNNs at low cost.
- Abstract: By exploiting discrete signal processing and simulating brain neuron communication, Spiking Neural Networks (SNNs) offer a low-energy alternative to Artificial Neural Networks (ANNs). However, existing SNN models still face high computational costs due to the numerous time steps as well as network depth and scale. The tens of billions of neurons and trillions of synapses in the human brain are developed from only 20,000 genes, which inspires us to design an efficient genetic encoding strategy that dynamically evolves to regulate large-scale deep SNNs at low cost. We therefore first propose a genetically scaled SNN encoding scheme that incorporates globally shared genetic interactions to indirectly optimize neuronal encoding instead of weights, which brings substantial reductions in parameters and energy consumption. Then, a spatio-temporal evolutionary framework is designed to optimize the inherent initial wiring rules. Two dynamic regularization operators in the fitness function evolve the neuronal encoding toward a suitable distribution and enhance the information quality of the genetic interactions, respectively, substantially accelerating evolution and improving efficiency. Experiments show that our approach compresses parameters by approximately 50\% to 80\%, while outperforming models on the same architectures by 0.21\% to 4.38\% on CIFAR-10, CIFAR-100 and ImageNet. In summary, the consistent trends of the proposed genetically encoded spatio-temporal evolution across different datasets and architectures highlight its significant gains in efficiency, scalability and robustness, demonstrating the advantages of brain-inspired evolutionary genetic coding for SNN optimization.
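The core idea of indirect genetic encoding — storing a small set of globally shared "genes" plus their interactions, and generating per-neuron encodings from them instead of optimizing every weight — can be illustrated with a minimal sketch. All sizes and the interaction form below are hypothetical, chosen only to show why the parameter count shrinks; they are not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a layer of 512 neurons encoded by only 32 shared genes.
n_neurons, n_genes = 512, 32

genes = rng.standard_normal(n_genes)                    # globally shared genome
interaction = rng.standard_normal((n_genes, n_genes))   # gene-gene interaction matrix
decoder = rng.standard_normal((n_neurons, n_genes))     # maps genes to neuron encodings

# Indirect encoding: neuron parameters are *generated*, not stored per neuron.
neuron_encoding = decoder @ np.tanh(interaction @ genes)

# A direct encoding would store parameters per connection (e.g. a dense
# recurrent layer); the genetic encoding stores only genome + interactions.
direct_params = n_neurons * n_neurons
genetic_params = n_genes + n_genes ** 2 + n_neurons * n_genes
print(f"compression ratio: {genetic_params / direct_params:.3f}")
```

Evolution would then act on `genes` and `interaction` rather than on the full weight set, which is where the reported 50% to 80% parameter compression plausibly comes from at scale.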
Related papers
- NeuroLGP-SM: Scalable Surrogate-Assisted Neuroevolution for Deep Neural Networks [0.0]
Evolutionary algorithms play a crucial role in the architectural configuration and training of Artificial Deep Neural Networks (DNNs).
In this work, we use phenotypic distance vectors, outputted from DNNs, alongside Kriging Partial Least Squares (KPLS) to make them suitable for search.
Our proposed approach, named Neuro-Linear Genetic Programming surrogate model (NeuroLGP-SM), efficiently and accurately estimates DNN fitness without the need for complete evaluations.
arXiv Detail & Related papers (2024-04-12T19:15:38Z)
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z)
- Brain-inspired Evolutionary Architectures for Spiking Neural Networks [6.607406750195899]
We explore efficient architectural optimization for Spiking Neural Networks (SNNs).
This paper evolves SNN architectures by incorporating brain-inspired local modular structure and global cross-module connectivity.
We introduce an efficient multi-objective evolutionary algorithm based on a few-shot performance predictor, endowing SNNs with high performance, efficiency and low energy consumption.
arXiv Detail & Related papers (2023-09-11T06:39:11Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Adaptive Sparse Structure Development with Pruning and Regeneration for Spiking Neural Networks [6.760855795263126]
Spiking Neural Networks (SNNs) have the natural advantage of drawing on the sparse structural plasticity of brain development to alleviate the energy problems of deep neural networks.
This paper proposes a novel method for the adaptive structural development of SNNs, introducing dendritic spine plasticity-based synaptic constraints, neuronal pruning and synaptic regeneration.
arXiv Detail & Related papers (2022-11-22T12:23:30Z)
- Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z)
- Optimizing Deep Neural Networks through Neuroevolution with Stochastic Gradient Descent [18.70093247050813]
Stochastic gradient descent (SGD) is dominant in training deep neural networks (DNNs).
Neuroevolution is more in line with an evolutionary process and provides some key capabilities that are often unavailable in SGD.
A hierarchical cluster-based suppression algorithm is also developed to overcome similar weight updates among individuals for improving population diversity.
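The suppression idea — penalizing individuals whose weight updates are too similar, so the population stays diverse — might be sketched as a greedy similarity filter. This is a simplified stand-in for the hierarchical clustering the paper describes; the threshold and cosine criterion are assumptions for illustration.

```python
import random


def suppress_similar(population, threshold=0.9):
    """Keep an individual only if its update vector is not nearly parallel
    (cosine similarity > threshold) to one already kept."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return dot / (na * nb + 1e-12)

    kept = []
    for ind in population:
        if all(cosine(ind, k) <= threshold for k in kept):
            kept.append(ind)
    return kept


random.seed(0)
# 20 random 8-dimensional update vectors, plus one exact duplicate.
pop = [[random.gauss(0, 1) for _ in range(8)] for _ in range(20)]
pop.append(list(pop[0]))
survivors = suppress_similar(pop)
print(len(pop), "->", len(survivors))  # the duplicate is always removed
```

In a real neuroevolution loop, suppressed individuals would be replaced or perturbed rather than simply dropped, which is what maintains diversity over generations.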
arXiv Detail & Related papers (2020-12-21T08:54:14Z)
- Optimizing Memory Placement using Evolutionary Graph Reinforcement Learning [56.83172249278467]
We introduce Evolutionary Graph Reinforcement Learning (EGRL), a method designed for large search spaces.
We train and validate our approach directly on the Intel NNP-I chip for inference.
We additionally achieve 28-78% speed-up compared to the native NNP-I compiler on all three workloads.
arXiv Detail & Related papers (2020-07-14T18:50:12Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Genetic Algorithmic Parameter Optimisation of a Recurrent Spiking Neural Network Model [0.6767885381740951]
We use a genetic algorithm (GA) to search for optimal parameters in recurrent spiking neural networks (SNNs).
We consider a cortical-column-based SNN comprising 1000 Izhikevich spiking neurons, chosen for computational efficiency and biological realism.
We show that the GA optimal population size was within 16-20 while the crossover rate that returned the best fitness value was 0.95.
arXiv Detail & Related papers (2020-03-30T22:44:04Z)
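A minimal GA loop of the kind this last paper describes — small population, high crossover rate — could look like the following. The fitness function here is a toy surrogate (real use would simulate the spiking network and score its activity), and the selection and mutation schemes are illustrative assumptions; only the population size and 0.95 crossover rate come from the summary above.

```python
import random

random.seed(42)

POP_SIZE, CROSSOVER_RATE, MUT_RATE, GENS = 18, 0.95, 0.05, 50
N_PARAMS = 6  # e.g. neuron/synapse parameters of the SNN model


def fitness(ind):
    # Toy surrogate with optimum at all parameters = 0.5.
    return -sum((x - 0.5) ** 2 for x in ind)


def crossover(a, b):
    point = random.randrange(1, N_PARAMS)  # single-point crossover
    return a[:point] + b[point:]


def mutate(ind):
    return [x + random.gauss(0, 0.1) if random.random() < MUT_RATE else x
            for x in ind]


pop = [[random.random() for _ in range(N_PARAMS)] for _ in range(POP_SIZE)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP_SIZE // 2]          # truncation selection
    children = []
    while len(children) < POP_SIZE - len(parents):
        a, b = random.sample(parents, 2)
        child = crossover(a, b) if random.random() < CROSSOVER_RATE else a[:]
        children.append(mutate(child))
    pop = parents + children

best = max(pop, key=fitness)
print(f"best fitness: {fitness(best):.4f}")
```

Swapping `fitness` for a function that runs the SNN simulation and compares spike statistics against a target turns this sketch into the parameter-search setting the paper studies.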