Multi-Objective Optimisation of Cortical Spiking Neural Networks With
Genetic Algorithms
- URL: http://arxiv.org/abs/2105.06824v1
- Date: Fri, 14 May 2021 13:35:39 GMT
- Title: Multi-Objective Optimisation of Cortical Spiking Neural Networks With
Genetic Algorithms
- Authors: James Fitzgerald and KongFatt Wong-Lin
- Abstract summary: Spiking neural networks (SNNs) communicate through the all-or-none spiking activity of neurons.
Previous work using genetic algorithm (GA) optimisation on an efficient SNN model was limited to a single parameter and objective.
This work applied a version of GA, called non-dominated sorting GA (NSGA-III), to demonstrate the feasibility of performing multi-objective optimisation on the same SNN.
- Score: 0.7995360025953929
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spiking neural networks (SNNs) communicate through the all-or-none spiking
activity of neurons. However, fitting the large number of SNN model parameters
to observed neural activity patterns, for example, in biological experiments,
remains a challenge. Previous work applying genetic algorithm (GA)
optimisation to a specific efficient SNN model, based on the Izhikevich
neuronal model, was limited to a single parameter and objective. This work
applied a version of GA, called non-dominated sorting GA (NSGA-III), to
demonstrate the feasibility of performing multi-objective optimisation on the
same SNN, focusing on searching network connectivity parameters to achieve
target firing rates for excitatory and inhibitory neuronal types, including
across different levels of network connectivity sparsity. We showed that
NSGA-III could readily optimise for various firing rates. Notably, when the
excitatory neural firing rates were higher than or equal to those of the
inhibitory neurons, the errors were small. Moreover, when connectivity
sparsity was included as a parameter to be optimised, the optimal solutions
required sparse network connectivity. We also found that when excitatory
neural firing rates were lower than those of the inhibitory neurons, the
errors were generally larger. Overall, we have successfully demonstrated the
feasibility of implementing multi-objective GA optimisation on the network
parameters of a recurrent and sparse SNN.
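For concreteness, below is a minimal sketch of how such a multi-objective firing-rate fit could be set up with NSGA-III using the open-source pymoo library. The paper does not prescribe a software implementation; pymoo is our choice here, and the parameter set, bounds, targets, and the simulate_rates function are hypothetical stand-ins for the actual Izhikevich-network simulation.

```python
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga3 import NSGA3
from pymoo.util.ref_dirs import get_reference_directions
from pymoo.optimize import minimize

# Hypothetical target firing rates (Hz) for the two neuronal types.
TARGET_EXC, TARGET_INH = 5.0, 10.0


def simulate_rates(w_ee, w_ei, w_ie, sparsity):
    """Toy analytic stand-in for simulating the SNN and measuring mean
    excitatory/inhibitory firing rates; replace with a real simulation."""
    exc = 20.0 * w_ee * sparsity / (1.0 + w_ie)
    inh = 30.0 * w_ei * sparsity / (1.0 + 0.5 * w_ie)
    return exc, inh


class FiringRateProblem(ElementwiseProblem):
    def __init__(self):
        # 4 search variables: E->E, E->I, I->E strengths and connection sparsity
        super().__init__(n_var=4, n_obj=2,
                         xl=np.array([0.0, 0.0, 0.0, 0.05]),
                         xu=np.array([2.0, 2.0, 2.0, 1.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        exc, inh = simulate_rates(*x)
        # One objective per neuronal type: absolute firing-rate error.
        out["F"] = [abs(exc - TARGET_EXC), abs(inh - TARGET_INH)]


# Reference directions partition the 2-objective space for NSGA-III.
ref_dirs = get_reference_directions("das-dennis", 2, n_partitions=12)
algorithm = NSGA3(ref_dirs=ref_dirs, pop_size=20)
res = minimize(FiringRateProblem(), algorithm, ("n_gen", 50), seed=1)
print(res.F[:5])  # a few Pareto-front error pairs
```

The population size of 20 echoes the 16-20 range the earlier single-objective GA study (listed below) found effective, though the paper's own NSGA-III settings may differ.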
Related papers
- Unveiling the Power of Sparse Neural Networks for Feature Selection [60.50319755984697]
Sparse Neural Networks (SNNs) have emerged as powerful tools for efficient feature selection.
We show that feature selection with SNNs trained with dynamic sparse training (DST) algorithms can achieve, on average, more than 50% memory and 55% FLOPs reduction.
arXiv Detail & Related papers (2024-08-08T16:48:33Z)
- Graph Neural Networks for Learning Equivariant Representations of Neural Networks [55.04145324152541]
We propose to represent neural networks as computational graphs of parameters.
Our approach enables a single model to encode neural computational graphs with diverse architectures.
We showcase the effectiveness of our method on a wide range of tasks, including classification and editing of implicit neural representations.
arXiv Detail & Related papers (2024-03-18T18:01:01Z)
- Evolutionary Multi-objective Optimisation in Neurotrajectory Prediction [0.0]
This work takes a step forward in neuroevolution for vehicle trajectory prediction.
To this end, rich ANNs composed of CNNs and Long Short-Term Memory (LSTM) networks are adopted.
Two well-known and robust Evolutionary Multi-objective Optimisation (EMO) algorithms, NSGA-II and MOEA/D, are also adopted.
arXiv Detail & Related papers (2023-08-04T21:06:26Z)
- High-performance deep spiking neural networks with 0.3 spikes per neuron [9.01407445068455]
It is harder to train biologically-inspired spiking neural networks (SNNs) than artificial neural networks (ANNs).
We show that trained deep SNN models can achieve the exact same performance as ANNs.
Our SNN accomplishes high-performance classification with less than 0.3 spikes per neuron, lending itself to an energy-efficient implementation.
arXiv Detail & Related papers (2023-06-14T21:01:35Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
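As we read this abstract, the mechanism scales otherwise relation-agnostic messages by a single learnable scalar per edge type, plus one for self-loops. A minimal PyTorch sketch of that reading (not the authors' reference implementation):

```python
import torch
import torch.nn as nn


class REGNNLayer(nn.Module):
    def __init__(self, dim, num_relations):
        super().__init__()
        self.lin = nn.Linear(dim, dim)                        # shared transform
        self.rel_w = nn.Parameter(torch.ones(num_relations))  # 1 scalar/relation
        self.self_w = nn.Parameter(torch.ones(1))             # self-loop weight

    def forward(self, x, edge_index, edge_type):
        # x: [N, dim]; edge_index: [2, E] rows (src, dst); edge_type: [E]
        src, dst = edge_index
        h = self.lin(x)
        msg = h[src] * self.rel_w[edge_type].unsqueeze(-1)  # scale by relation
        out = (self.self_w * h).index_add(0, dst, msg)      # sum-aggregate
        return torch.relu(out)
```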
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Deep Kronecker neural networks: A general framework for neural networks with adaptive activation functions [4.932130498861987]
We propose a new type of neural network, Kronecker neural networks (KNNs), which form a general framework for neural networks with adaptive activation functions.
Under suitable conditions, KNNs induce a faster decay of the loss than feed-forward networks.
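For intuition, an adaptive activation in this spirit can be pictured as a learnable mixture of scaled nonlinearities trained jointly with the network weights. The sketch below is illustrative, not the paper's exact Kronecker construction:

```python
import torch
import torch.nn as nn


class AdaptiveActivation(nn.Module):
    """sum_k w_k * tanh(a_k * x), with trainable w_k and a_k."""

    def __init__(self, k=4):
        super().__init__()
        self.w = nn.Parameter(torch.full((k,), 1.0 / k))   # mixture weights
        self.a = nn.Parameter(torch.arange(1.0, k + 1.0))  # per-term scales

    def forward(self, x):
        # Broadcast x against the k mixture terms, then sum them out.
        return (self.w * torch.tanh(self.a * x.unsqueeze(-1))).sum(-1)
```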
arXiv Detail & Related papers (2021-05-20T04:54:57Z)
- Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks [1.8515971640245998]
Spiking neural networks (SNNs) have been investigated as more biologically plausible and potentially more powerful models of neural computation.
We show how a novel surrogate gradient combined with recurrent networks of tunable and adaptive spiking neurons yields state-of-the-art performance for SNNs.
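A surrogate gradient typically keeps the all-or-none spike in the forward pass but substitutes a smooth pseudo-derivative in the backward pass so backpropagation can proceed. A generic PyTorch sketch using a fast-sigmoid surrogate (the paper's specific surrogate may differ):

```python
import torch


class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a fast-sigmoid surrogate derivative."""

    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()  # all-or-none spike

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Smooth pseudo-derivative 1 / (1 + beta*|v|)^2 replaces the
        # true (zero-almost-everywhere) derivative of the step function.
        beta = 10.0
        return grad_output / (1.0 + beta * v.abs()) ** 2


spike = SpikeFn.apply
# Usage inside a recurrent step: s_t = spike(v_t - v_threshold)
```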
arXiv Detail & Related papers (2021-03-12T10:27:29Z)
- Exploiting Heterogeneity in Operational Neural Networks by Synaptic Plasticity [87.32169414230822]
The recently proposed network model, Operational Neural Networks (ONNs), can generalize conventional Convolutional Neural Networks (CNNs).
In this study, the focus is on searching for the best-possible operator set(s) for the hidden neurons of the network, based on the Synaptic Plasticity paradigm that constitutes the essential learning theory in biological neurons.
Experimental results over highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, can achieve superior learning performance compared to GIS-based ONNs.
arXiv Detail & Related papers (2020-08-21T19:03:23Z)
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
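For intuition, the classical mean-field picture of a single wide layer replaces the empirical average over neurons with an integral against a probability measure over parameters; the feature-based framework here generalises this idea to deep architectures. The equation below is our paraphrase of standard mean-field background, not the paper's exact formulation:

$$ f(x) = \frac{1}{N}\sum_{i=1}^{N} a_i\,\sigma(w_i^\top x) \;\xrightarrow[N\to\infty]{}\; \int a\,\sigma(w^\top x)\,\mathrm{d}\rho(a,w) $$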
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
- Genetic Algorithmic Parameter Optimisation of a Recurrent Spiking Neural Network Model [0.6767885381740951]
We use a genetic algorithm (GA) to search for optimal parameters in recurrent spiking neural networks (SNNs).
We consider a cortical-column-based SNN comprising 1000 Izhikevich spiking neurons, for computational efficiency and biological realism.
We show that the GA optimal population size was within the range 16-20, while the crossover rate that returned the best fitness value was 0.95.
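The Izhikevich model underlying this network is a two-variable system with a reset rule, which is what keeps a 1000-neuron simulation cheap. A minimal Python sketch of one regular-spiking neuron, with the GA wiring omitted and an illustrative constant input current:

```python
# Standard regular-spiking cortical parameters from Izhikevich (2003).
a, b, c, d = 0.02, 0.2, -65.0, 8.0
dt, T = 0.5, 1000.0           # time step and duration (ms)
v, u = -65.0, b * -65.0       # membrane potential (mV) and recovery variable
spike_times = []

for step in range(int(T / dt)):
    I = 10.0  # constant input current (illustrative; real input is synaptic)
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:             # all-or-none spike, then reset
        spike_times.append(step * dt)
        v, u = c, u + d

print(f"{len(spike_times)} spikes -> {1000.0 * len(spike_times) / T:.1f} Hz")
```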
arXiv Detail & Related papers (2020-03-30T22:44:04Z)