Evolving Efficient Genetic Encoding for Deep Spiking Neural Networks
- URL: http://arxiv.org/abs/2411.06792v1
- Date: Mon, 11 Nov 2024 08:40:52 GMT
- Title: Evolving Efficient Genetic Encoding for Deep Spiking Neural Networks
- Authors: Wenxuan Pan, Feifei Zhao, Bing Han, Haibo Tong, Yi Zeng
- Abstract summary: Spiking Neural Networks (SNNs) offer a low-energy alternative to Artificial Neural Networks (ANNs)
Existing SNN models still face high computational costs due to their numerous time steps as well as network depth and scale.
We propose an efficient genetic encoding strategy that dynamically evolves to regulate large-scale deep SNNs at low cost.
- Score: 10.368223587448382
- License:
- Abstract: By exploiting discrete signal processing and simulating brain neuron communication, Spiking Neural Networks (SNNs) offer a low-energy alternative to Artificial Neural Networks (ANNs). However, existing SNN models still face high computational costs due to their numerous time steps as well as their network depth and scale. The tens of billions of neurons and trillions of synapses in the human brain develop from only about 20,000 genes, which inspires us to design an efficient genetic encoding strategy that dynamically evolves to regulate large-scale deep SNNs at low cost. We therefore first propose a genetically scaled SNN encoding scheme that incorporates globally shared genetic interactions to indirectly optimize neuronal encoding instead of weights, markedly reducing parameter counts and energy consumption. Then, a spatio-temporal evolutionary framework is designed to optimize the inherent initial wiring rules. Two dynamic regularization operators in the fitness function evolve the neuronal encoding toward a suitable distribution and enhance the information quality of the genetic interactions, respectively, substantially accelerating evolution and improving efficiency. Experiments show that our approach compresses parameters by approximately 50\% to 80\% while outperforming models on the same architectures by 0.21\% to 4.38\% on CIFAR-10, CIFAR-100 and ImageNet. In summary, the consistent gains of the proposed genetically encoded spatio-temporal evolution across datasets and architectures highlight its efficiency, broad scalability and robustness, demonstrating the advantages of brain-inspired evolutionary genetic coding for SNN optimization.
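The core idea of the abstract, decoding connection strengths from compact per-neuron gene vectors through a globally shared interaction matrix rather than storing a full weight matrix, can be sketched as follows. This is a minimal illustration with made-up names and sizes, not the paper's actual encoding:

```python
import numpy as np

# Hypothetical sketch of a gene-based indirect encoding: instead of storing a
# full n_out x n_in weight matrix, each neuron carries a short "gene" vector,
# and connection strengths are decoded through a small interaction matrix
# that is shared globally. All names and sizes here are illustrative.

rng = np.random.default_rng(0)

n_in, n_out, n_genes = 256, 128, 16          # n_genes << n_in, n_out

genes_in = rng.standard_normal((n_in, n_genes))    # per-neuron gene vectors
genes_out = rng.standard_normal((n_out, n_genes))
interaction = rng.standard_normal((n_genes, n_genes))  # globally shared

# Weights are decoded on the fly rather than stored directly.
weights = genes_out @ interaction @ genes_in.T     # shape (n_out, n_in)

direct_params = n_in * n_out                       # 32768 stored weights
encoded_params = (n_in + n_out) * n_genes + n_genes * n_genes  # 6400
print(weights.shape, round(encoded_params / direct_params, 3))  # ~0.195
```

In this toy configuration the encoded representation stores roughly 80% fewer parameters than the direct weight matrix, consistent in spirit with the 50% to 80% compression range the abstract reports, though the paper's actual scheme and evolutionary optimization are more involved.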
Related papers
- Temporal Misalignment and Probabilistic Neurons [17.73940693302129]
Spiking Neural Networks (SNNs) offer a more energy-efficient alternative to Artificial Neural Networks (ANNs)
In this work, we identify a phenomenon in the ANN-SNN conversion framework, termed temporal misalignment.
We introduce biologically plausible two-phase probabilistic (TPP) spiking neurons, further enhancing the conversion process.
arXiv Detail & Related papers (2025-02-20T12:09:30Z) - Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval.
A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed.
The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z) - Bio-Inspired Adaptive Neurons for Dynamic Weighting in Artificial Neural Networks [6.931200003384122]
Traditional neural networks employ fixed weights during inference, limiting their ability to adapt to changing input conditions.
We propose a novel framework for adaptive neural networks, where neuron weights are modeled as functions of the input signal.
arXiv Detail & Related papers (2024-12-02T12:45:30Z) - NeuroLGP-SM: Scalable Surrogate-Assisted Neuroevolution for Deep Neural Networks [0.0]
Evolutionary algorithms play a crucial role in the architectural configuration and training of Artificial Deep Neural Networks (DNNs)
In this work, we use phenotypic distance vectors output from DNNs, together with Kriging Partial Least Squares (KPLS), to make them suitable for surrogate-assisted search.
Our proposed approach, named Neuro-Linear Genetic Programming surrogate model (NeuroLGP-SM), efficiently and accurately estimates DNN fitness without the need for complete evaluations.
arXiv Detail & Related papers (2024-04-12T19:15:38Z) - Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z) - SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - Optimizing Deep Neural Networks through Neuroevolution with Stochastic Gradient Descent [18.70093247050813]
Stochastic gradient descent (SGD) is dominant in training deep neural networks (DNNs)
Neuroevolution is more in line with an evolutionary process and provides some key capabilities that are often unavailable in SGD.
A hierarchical cluster-based suppression algorithm is also developed to overcome similar weight updates among individuals for improving population diversity.
arXiv Detail & Related papers (2020-12-21T08:54:14Z) - Optimizing Memory Placement using Evolutionary Graph Reinforcement Learning [56.83172249278467]
We introduce Evolutionary Graph Reinforcement Learning (EGRL), a method designed for large search spaces.
We train and validate our approach directly on the Intel NNP-I chip for inference.
We additionally achieve 28-78% speed-up compared to the native NNP-I compiler on all three workloads.
arXiv Detail & Related papers (2020-07-14T18:50:12Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - Genetic Algorithmic Parameter Optimisation of a Recurrent Spiking Neural Network Model [0.6767885381740951]
We use a genetic algorithm (GA) to search for optimal parameters in recurrent spiking neural networks (SNNs)
We consider a cortical-column-based SNN comprising 1000 Izhikevich spiking neurons, chosen for computational efficiency and biological realism.
We show that the GA optimal population size was within 16-20 while the crossover rate that returned the best fitness value was 0.95.
arXiv Detail & Related papers (2020-03-30T22:44:04Z)
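The GA setup summarized in the last entry (a population of around 20 and a crossover rate of 0.95) can be illustrated with a minimal, generic genetic-algorithm loop. The toy fitness function, target vector, and all parameter values other than the population size and crossover rate are illustrative, not taken from the paper:

```python
import numpy as np

# Minimal genetic-algorithm sketch: evolve a real-valued parameter vector
# toward a toy target using tournament selection, uniform crossover at
# rate 0.95, and Gaussian mutation. Illustrative only; not the paper's code.

rng = np.random.default_rng(1)
target = np.array([1.0, -2.0, 0.5, 3.0])   # toy optimum (illustrative)

def fitness(ind):
    return -np.sum((ind - target) ** 2)    # higher is better

pop_size, n_gen, cx_rate = 20, 200, 0.95
pop = rng.standard_normal((pop_size, target.size)) * 5

for _ in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    # tournament selection: the fitter of two random individuals wins
    idx = rng.integers(0, pop_size, (pop_size, 2))
    winners = np.where(scores[idx[:, 0]] > scores[idx[:, 1]], idx[:, 0], idx[:, 1])
    parents = pop[winners]
    # uniform crossover between consecutive parent pairs
    children = parents.copy()
    for i in range(0, pop_size - 1, 2):
        if rng.random() < cx_rate:
            mask = rng.random(target.size) < 0.5
            children[i, mask], children[i + 1, mask] = parents[i + 1, mask], parents[i, mask]
    children += rng.standard_normal(children.shape) * 0.1  # Gaussian mutation
    pop = children

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(np.round(best, 2))
```

The paper additionally reports that, in its setting, a population size of 16 to 20 and a crossover rate of 0.95 returned the best fitness; the toy loop above only demonstrates the mechanics of such a search.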
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.