Genetic Algorithmic Parameter Optimisation of a Recurrent Spiking Neural
Network Model
- URL: http://arxiv.org/abs/2003.13850v2
- Date: Wed, 27 May 2020 23:43:06 GMT
- Title: Genetic Algorithmic Parameter Optimisation of a Recurrent Spiking Neural
Network Model
- Authors: Ifeatu Ezenwe, Alok Joshi and KongFatt Wong-Lin
- Abstract summary: We use a genetic algorithm (GA) to search for optimal parameters in recurrent spiking neural networks (SNNs).
We consider a cortical column-based SNN comprising 1000 Izhikevich spiking neurons for computational efficiency and biological realism.
We show that the optimal GA population size was within 16-20, while the crossover rate that returned the best fitness value was 0.95.
- Score: 0.6767885381740951
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural networks are complex algorithms that loosely model the behaviour of
the human brain. They play a significant role in computational neuroscience and
artificial intelligence. The next generation of neural network models is based
on the spike timing activity of neurons: spiking neural networks (SNNs).
However, model parameters in SNNs are difficult to search and optimise.
Previous studies using genetic algorithm (GA) optimisation of SNNs were focused
mainly on simple, feedforward, or oscillatory networks, but not much work has
been done on optimising cortex-like recurrent SNNs. In this work, we
investigated the use of GAs to search for optimal parameters in recurrent SNNs
to reach targeted neuronal population firing rates, e.g. as in experimental
observations. We considered a cortical column-based SNN comprising 1000
Izhikevich spiking neurons for computational efficiency and biological
realism. The model parameters explored were the neuronal bias input currents.
First, for this particular SNN, we found the optimal parameter values for
targeted population-averaged firing activities, with the algorithm converging
within ~100 generations. We then showed that the optimal GA population size
was within ~16-20, while the crossover rate that returned the best fitness
value was ~0.95. Overall, we have successfully demonstrated the feasibility of
implementing a GA to optimise model parameters in a recurrent, cortical
column-based SNN.
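
To make the pipeline in the abstract concrete, here is a minimal, illustrative Python sketch; it is assumed code, not the authors' implementation. It optimises a single shared bias current for one homogeneous population of 1000 regular-spiking Izhikevich neurons against an assumed 10 Hz target rate, with GA settings echoing the abstract (population size ~18, crossover rate ~0.95, ~100 generations). The paper's actual cortical column-based model, with its multiple neuron types and full parameter set, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_firing_rate(bias, n_neurons=1000, t_ms=1000):
    """Population-averaged firing rate (Hz) of Izhikevich neurons
    driven by a shared bias current plus per-neuron noise."""
    a, b, c, d = 0.02, 0.2, -65.0, 8.0   # regular-spiking parameters
    v = np.full(n_neurons, -65.0)        # membrane potential (mV)
    u = b * v                            # recovery variable
    n_spikes = 0
    for _ in range(t_ms):                # 1 ms per step
        I = bias + 5.0 * rng.standard_normal(n_neurons)
        fired = v >= 30.0
        n_spikes += fired.sum()
        v[fired] = c
        u[fired] += d
        # Two 0.5 ms half-steps for v, as in Izhikevich (2003).
        for _ in range(2):
            v += 0.5 * (0.04 * v**2 + 5.0 * v + 140.0 - u + I)
        u += a * (b * v - u)
    return 1000.0 * n_spikes / (n_neurons * t_ms)

def fitness(bias, target_rate=10.0):
    """Negative absolute error w.r.t. an assumed 10 Hz target rate."""
    return -abs(simulate_firing_rate(bias) - target_rate)

def ga_optimise(pop_size=18, n_generations=100, crossover_rate=0.95,
                mutation_sd=0.5, bounds=(0.0, 20.0)):
    """Real-coded GA over the single bias-current parameter.
    Defaults mirror the abstract: population ~16-20, crossover ~0.95,
    convergence within ~100 generations; bounds are assumed."""
    pop = rng.uniform(*bounds, size=pop_size)
    for _ in range(n_generations):
        fit = np.array([fitness(x) for x in pop])
        # Binary tournament selection.
        picks = rng.integers(0, pop_size, size=(pop_size, 2))
        parents = pop[np.where(fit[picks[:, 0]] >= fit[picks[:, 1]],
                               picks[:, 0], picks[:, 1])]
        # Arithmetic crossover on consecutive pairs.
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            if rng.random() < crossover_rate:
                w = rng.random()
                children[i] = w * parents[i] + (1 - w) * parents[i + 1]
                children[i + 1] = (1 - w) * parents[i] + w * parents[i + 1]
        # Gaussian mutation, clipped to the (assumed) parameter bounds.
        pop = np.clip(children + mutation_sd * rng.standard_normal(pop_size),
                      *bounds)
    return pop[np.argmax([fitness(x) for x in pop])]

best = ga_optimise()
print(f"best bias: {best:.2f}, rate: {simulate_firing_rate(best):.1f} Hz")
```

The fitness here simply rewards closeness of the simulated population-averaged rate to the target, the single-objective analogue of matching experimentally observed firing rates.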
Related papers
- Spatial-Temporal Search for Spiking Neural Networks [32.937536365872745]
Spiking Neural Networks (SNNs) are considered a potential candidate for the next generation of artificial intelligence.
We propose a differentiable approach to optimize SNNs on both the spatial and temporal dimensions.
Our methods achieve comparable classification performance on CIFAR10/100 and ImageNet, with accuracies of 96.43%, 78.96%, and 70.21%, respectively.
arXiv Detail & Related papers (2024-10-24T09:32:51Z)
- Unveiling the Power of Sparse Neural Networks for Feature Selection [60.50319755984697]
Sparse Neural Networks (SNNs) have emerged as powerful tools for efficient feature selection.
Our findings show that feature selection with SNNs trained with dynamic sparse training (DST) algorithms can achieve, on average, more than $50%$ memory and $55%$ FLOPs reduction.
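
To make the dynamic sparse training idea above concrete, here is a minimal sketch of one SET-style prune-and-regrow mask update (a generic illustration with assumed names, not the specific DST algorithms this paper evaluates):

```python
import numpy as np

rng = np.random.default_rng(0)

def dst_mask_update(weights, mask, prune_frac=0.3):
    """One SET-style dynamic-sparse-training step: drop the
    smallest-magnitude active weights, regrow at random inactive positions."""
    active = np.flatnonzero(mask)
    n_swap = int(prune_frac * active.size)
    # Prune: deactivate the n_swap active weights with smallest |w|.
    drop = active[np.argsort(np.abs(weights.flat[active]))[:n_swap]]
    mask.flat[drop] = False
    # Regrow: activate the same number of currently inactive positions.
    grow = rng.choice(np.flatnonzero(~mask), size=n_swap, replace=False)
    mask.flat[grow] = True
    weights.flat[grow] = 0.0          # new connections start from zero
    return weights * mask, mask

# Toy usage: a ~10% dense 8x8 weight matrix; sparsity level is preserved.
w = rng.standard_normal((8, 8))
m = rng.random((8, 8)) < 0.1
w_sparse, m = dst_mask_update(w, m)
print(m.sum(), "active weights")
```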
arXiv Detail & Related papers (2024-08-08T16:48:33Z)
- Sparse Spiking Neural Network: Exploiting Heterogeneity in Timescales for Pruning Recurrent SNN [19.551319330414085]
Recurrent Spiking Neural Networks (RSNNs) have emerged as a computationally efficient and brain-inspired learning model.
Traditionally, sparse SNNs are obtained by first training a dense and complex SNN for a target task.
This paper presents a task-agnostic methodology for designing sparse RSNNs by pruning a large, randomly initialized model.
arXiv Detail & Related papers (2024-03-06T02:36:15Z)
- High-performance deep spiking neural networks with 0.3 spikes per neuron [9.01407445068455]
It is harder to train biologically-inspired spiking neural networks (SNNs) than artificial neural networks (ANNs).
We show that deep SNN models can be trained to achieve the exact same performance as ANNs.
Our SNN accomplishes high-performance classification with less than 0.3 spikes per neuron, lending itself to an energy-efficient implementation.
arXiv Detail & Related papers (2023-06-14T21:01:35Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
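
For intuition about what heterogeneous coding can mix, here is a small sketch (assumed for illustration, not this paper's scheme) contrasting two standard schemes, rate coding and latency (time-to-first-spike) coding:

```python
import numpy as np

rng = np.random.default_rng(0)

def rate_encode(x, n_steps=100):
    """Rate coding: intensity x in [0, 1] sets the per-step spike
    probability (Bernoulli approximation of a Poisson train)."""
    return (rng.random((n_steps,) + x.shape) < x).astype(np.int8)

def latency_encode(x, n_steps=100):
    """Latency coding: brighter inputs spike earlier; each input emits
    at most one spike, at step round((1 - x) * (n_steps - 1))."""
    spikes = np.zeros((n_steps,) + x.shape, dtype=np.int8)
    t = np.round((1.0 - x) * (n_steps - 1)).astype(int)
    np.put_along_axis(spikes, t[None, ...], 1, axis=0)
    return spikes

x = rng.random((4,))                       # toy 4-pixel "image"
print(rate_encode(x).sum(axis=0))          # spike counts grow with intensity
print(latency_encode(x).argmax(axis=0))    # spike times shrink with intensity
```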
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
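
The non-differentiability mentioned above stems from the hard spike threshold. A common remedy, shown here as a generic surrogate-gradient sketch (not the DSR method itself; names are illustrative), keeps the exact step function in the forward pass but substitutes a smooth derivative in the backward pass:

```python
import numpy as np

def spike_forward(v, threshold=1.0):
    """Forward pass: hard threshold; its true derivative is zero
    almost everywhere, which blocks backpropagation."""
    return (v >= threshold).astype(float)

def spike_backward(v, grad_out, threshold=1.0, width=0.5):
    """Backward pass: replace the step's derivative with a smooth
    surrogate (a rectangular/boxcar window around the threshold)."""
    surrogate = (np.abs(v - threshold) < width) / (2.0 * width)
    return grad_out * surrogate

# Toy check: only potentials near the threshold receive gradient.
v = np.array([0.2, 0.9, 1.1, 2.0])
print(spike_forward(v))                      # [0. 0. 1. 1.]
print(spike_backward(v, np.ones_like(v)))    # [0. 1. 1. 0.]
```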
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Multi-Objective Optimisation of Cortical Spiking Neural Networks With Genetic Algorithms [0.7995360025953929]
Spiking neural networks (SNNs) communicate through the all-or-none spiking activity of neurons.
Previous work using genetic algorithm (GA) optimisation on an efficient SNN model was limited to a single parameter and objective.
This work applied a GA variant, the non-dominated sorting GA (NSGA-III), to demonstrate the feasibility of performing multi-objective optimisation on the same SNN.
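
The core of NSGA-style methods is non-dominated sorting. The sketch below (a simplified generic version, not NSGA-III's full reference-point machinery; names are assumptions) peels a set of candidate solutions into successive Pareto fronts:

```python
def dominates(a, b):
    """a dominates b if a is no worse on all objectives and strictly
    better on at least one (maximisation assumed)."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def non_dominated_fronts(fitnesses):
    """Peel off successive Pareto fronts from a list of objective vectors."""
    remaining = list(range(len(fitnesses)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(fitnesses[j], fitnesses[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Two objectives, e.g. fit to excitatory and to inhibitory firing rates.
fits = [(0.9, 0.1), (0.5, 0.5), (0.1, 0.9), (0.4, 0.4)]
print(non_dominated_fronts(fits))  # [[0, 1, 2], [3]]
```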
arXiv Detail & Related papers (2021-05-14T13:35:39Z)
- Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks [1.8515971640245998]
Spiking neural networks (SNNs) have been investigated as more biologically plausible and potentially more powerful models of neural computation.
We show how a novel surrogate gradient combined with recurrent networks of tunable and adaptive spiking neurons yields state-of-the-art performance for SNNs.
arXiv Detail & Related papers (2021-03-12T10:27:29Z)
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
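
For orientation, the classical two-layer mean-field formulation (the standard starting point in this literature; the paper's feature-based framework extends the idea to deep architectures) replaces the average over $N$ units with an integral against a probability measure:

```latex
f_N(x) = \frac{1}{N}\sum_{i=1}^{N} a_i\,\sigma(w_i^{\top} x)
\;\longrightarrow\;
f(x) = \int a\,\sigma(w^{\top} x)\,\mathrm{d}\rho(a, w)
\quad (N \to \infty),
```

so that training becomes an evolution of the measure $\rho$ rather than of the individual weights.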
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
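
Schematically, such a training strategy implies a combined objective like the one below (a hypothetical sketch with assumed names and weighting, not the paper's actual loss): output error plus the mismatch between a recorded subset of hidden units and reference neural activities.

```python
import numpy as np

def combined_loss(outputs, targets, hidden, recorded, idx, alpha=0.5):
    """Output MSE plus MSE between a small subset of hidden units
    (columns `idx`) and the recorded activities of reference neurons."""
    task_term = np.mean((outputs - targets) ** 2)
    dynamics_term = np.mean((hidden[:, idx] - recorded) ** 2)
    return task_term + alpha * dynamics_term

# Toy shapes: 100 time steps, 3 outputs, 50 hidden units, 5 recorded neurons.
T, n_out, n_hid, n_rec = 100, 3, 50, 5
rng = np.random.default_rng(0)
loss = combined_loss(rng.standard_normal((T, n_out)),
                     rng.standard_normal((T, n_out)),
                     rng.standard_normal((T, n_hid)),
                     rng.standard_normal((T, n_rec)),
                     idx=np.arange(n_rec))
print(loss)
```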
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future DeepSNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)