A Deep 2-Dimensional Dynamical Spiking Neuronal Network for Temporal
Encoding trained with STDP
- URL: http://arxiv.org/abs/2009.00581v1
- Date: Tue, 1 Sep 2020 17:12:18 GMT
- Title: A Deep 2-Dimensional Dynamical Spiking Neuronal Network for Temporal
Encoding trained with STDP
- Authors: Matthew Evanusa and Cornelia Fermuller and Yiannis Aloimonos
- Abstract summary: We show that a large, deep layered SNN with dynamical, chaotic activity mimicking the mammalian cortex is capable of encoding information from temporal data.
We argue that the randomness inherent in the network weights allows the neurons to form groups that encode the temporal input data after self-organizing with STDP.
We analyze the network in terms of network entropy as a metric of information transfer.
- Score: 10.982390333064536
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The brain is known to be a highly complex, asynchronous dynamical system that
is highly tailored to encode temporal information. However, recent deep
learning approaches do not take advantage of this temporal coding. Spiking
Neural Networks (SNNs) can be trained using biologically-realistic learning
mechanisms, and can have neuronal activation rules that are biologically
relevant. This type of network is also structured fundamentally around
accepting temporal information through a time-decaying voltage update, a kind
of input that current rate-encoding networks have difficulty with. Here we show
that a large, deep layered SNN with dynamical, chaotic activity mimicking the
mammalian cortex, trained with biologically-inspired learning rules such as STDP, is
capable of encoding information from temporal data. We argue that the
randomness inherent in the network weights allows the neurons to form groups
that encode the temporal input data after self-organizing with STDP.
We aim to show that precise timing of input stimulus is critical in forming
synchronous neural groups in a layered network. We analyze the network in terms
of network entropy as a metric of information transfer. We hope to tackle two
problems at once: the creation of artificial temporal neural systems for
artificial intelligence, and the understanding of coding mechanisms in the brain.
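As a concrete illustration of the mechanisms described above, the sketch below combines the time-decaying voltage update of leaky integrate-and-fire neurons with pair-based STDP acting on random initial weights, plus a spike-count entropy readout. It is a minimal, hypothetical NumPy reconstruction of a single layer, not the authors' code; the sizes, time constants, learning rates, and Poisson input are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and constants (assumptions, not from the paper).
n_in, n_hid = 20, 100
T, dt = 500, 1.0                  # timesteps, ms per step
tau_v = 20.0                      # membrane time constant (ms)
v_thresh, v_reset = 1.0, 0.0
tau_pre, tau_post = 20.0, 20.0    # STDP trace time constants (ms)
a_plus, a_minus = 0.01, 0.012     # potentiation / depression amplitudes

# Random initial weights: the randomness the abstract argues seeds the
# neuron groups that later self-organize under STDP.
w = rng.uniform(0.0, 0.5, size=(n_in, n_hid))

v = np.zeros(n_hid)               # membrane voltages
pre_trace = np.zeros(n_in)        # presynaptic eligibility traces
post_trace = np.zeros(n_hid)      # postsynaptic eligibility traces
spike_counts = np.zeros(n_hid)

for t in range(T):
    # Poisson input spikes standing in for a temporal input pattern.
    in_spikes = rng.random(n_in) < 0.05

    # Time-decaying (leaky) voltage update: the neuron's temporal memory.
    v += dt * (-v / tau_v) + in_spikes @ w
    out_spikes = v >= v_thresh
    v[out_spikes] = v_reset
    spike_counts += out_spikes

    # Exponentially decaying pre/post traces for pair-based STDP.
    pre_trace += -dt * pre_trace / tau_pre + in_spikes
    post_trace += -dt * post_trace / tau_post + out_spikes

    # Pre-before-post potentiates; post-before-pre depresses.
    w += a_plus * np.outer(pre_trace, out_spikes)
    w -= a_minus * np.outer(in_spikes, post_trace)
    np.clip(w, 0.0, 1.0, out=w)

# Entropy of the firing distribution, a simple stand-in for the paper's
# "network entropy" metric of information transfer.
p = spike_counts / max(spike_counts.sum(), 1)
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
print(f"network entropy: {entropy:.2f} bits")
```

Under this reading, high entropy means activity is spread evenly across neurons, while a drop in entropy after STDP would signal the concentrated, synchronized groups the abstract describes.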
Related papers
- A frugal Spiking Neural Network for unsupervised classification of continuous multivariate temporal data [0.0]
Spiking Neural Networks (SNNs) are neuromorphic and use more biologically plausible neurons with evolving membrane potentials.
We introduce here a frugal single-layer SNN designed for fully unsupervised identification and classification of multivariate temporal patterns in continuous data.
arXiv Detail & Related papers (2024-08-08T08:15:51Z)
- Stochastic Spiking Neural Networks with First-to-Spike Coding [7.955633422160267]
Spiking Neural Networks (SNNs) are known for their bio-plausibility and energy efficiency.
In this work, we explore the merger of novel computing and information encoding schemes in SNN architectures.
We investigate the tradeoffs of our proposal in terms of accuracy, inference latency, spiking sparsity, and energy consumption across datasets.
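A minimal sketch of the first-to-spike readout idea (my illustration, not the paper's implementation; all parameters are assumptions): the class decision is taken the moment any readout neuron crosses threshold, which is what makes the scheme low-latency and spike-sparse.

```python
import numpy as np

def first_to_spike_decode(input_currents, tau_v=10.0, v_thresh=1.0):
    """Classify by which leaky readout neuron crosses threshold first.

    input_currents: (T, n_classes) array of per-step input drive.
    Returns the index of the first neuron to spike (ties broken by index),
    or -1 if no neuron spikes within the T simulated steps.
    """
    v = np.zeros(input_currents.shape[1])
    for drive_t in input_currents:
        v += -v / tau_v + drive_t          # leaky integration
        fired = np.flatnonzero(v >= v_thresh)
        if fired.size:
            return int(fired[0])           # decide at the earliest spike
    return -1

# Toy usage: class 2 gets the strongest drive, so it should fire first.
rng = np.random.default_rng(1)
drive = rng.random((100, 4)) * np.array([0.05, 0.05, 0.2, 0.05])
print(first_to_spike_decode(drive))
```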
arXiv Detail & Related papers (2024-04-26T22:52:23Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- A Spiking Neural Network based on Neural Manifold for Augmenting Intracortical Brain-Computer Interface Data [5.039813366558306]
Brain-computer interfaces (BCIs) transform neural signals in the brain into instructions to control external devices.
With the advent of advanced machine learning methods, the capability of brain-computer interfaces has been enhanced like never before.
Here, we use spiking neural networks (SNNs) as data generators.
arXiv Detail & Related papers (2022-03-26T15:32:31Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Neuromorphic Algorithm-hardware Codesign for Temporal Pattern Learning [11.781094547718595]
We derive an efficient training algorithm for Leaky Integrate and Fire neurons, which is capable of training an SNN to learn complex spatio-temporal patterns.
We have developed a CMOS circuit implementation for a memristor-based network of neurons and synapses which retains critical neural dynamics with reduced complexity.
arXiv Detail & Related papers (2021-04-21T18:23:31Z)
- Artificial Neural Variability for Deep Learning: On Overfitting, Noise Memorization, and Catastrophic Forgetting [135.0863818867184]
Artificial neural variability (ANV) helps artificial neural networks learn some advantages from "natural" neural networks.
ANV acts as an implicit regularizer of the mutual information between the training data and the learned model.
It can effectively relieve overfitting, label noise memorization, and catastrophic forgetting at negligible costs.
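A minimal sketch of the noise-injection idea behind such variability, assuming a PyTorch setting (the layer, sigma, and placement are my illustrative choices, not the paper's method):

```python
import torch
import torch.nn as nn

class NoisyLinear(nn.Module):
    """Linear layer that perturbs its pre-activations with Gaussian noise.

    Noise is injected only during training, acting as an implicit
    regularizer; at inference the layer behaves deterministically.
    """
    def __init__(self, d_in, d_out, sigma=0.1):
        super().__init__()
        self.fc = nn.Linear(d_in, d_out)
        self.sigma = sigma

    def forward(self, x):
        z = self.fc(x)
        if self.training:
            z = z + self.sigma * torch.randn_like(z)
        return z
```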
arXiv Detail & Related papers (2020-11-12T06:06:33Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in terms of low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
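Reading the title literally, a rectified linear PSP is zero before the presynaptic spike and grows linearly afterwards, which makes the membrane potential piecewise linear in time and spike-time gradients well behaved. The sketch below is that reading, with tau and all values illustrative assumptions:

```python
import numpy as np

def rel_psp(t, t_spike, tau=1.0):
    """Rectified linear PSP kernel: zero before the presynaptic spike,
    growing linearly after it."""
    return np.maximum(0.0, (t - np.asarray(t_spike)) / tau)

def membrane_potential(t, spike_times, weights, tau=1.0):
    """Voltage as a weighted sum of ReL-PSPs, piecewise linear in t."""
    return float(np.sum(weights * rel_psp(t, spike_times, tau)))

# Toy check: input spikes at 1.0 ms and 3.0 ms, evaluated at t = 5.0 ms.
print(membrane_potential(5.0, [1.0, 3.0], np.array([0.5, -0.2])))
```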
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
- Exploiting Neuron and Synapse Filter Dynamics in Spatial Temporal Learning of Deep Spiking Neural Network [7.503685643036081]
A bio-plausible SNN model with spatial-temporal properties is a complex dynamical system.
We formulate the SNN as a network of infinite impulse response (IIR) filters with neuron nonlinearity.
We propose a training algorithm that is capable of learning spatial-temporal patterns by searching for the optimal synapse filter kernels and weights.
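A minimal sketch of the IIR-filter view (the first-order form and alpha are illustrative assumptions; the paper searches over filter kernels and weights rather than fixing them):

```python
import numpy as np

def iir_synapse(spikes, alpha=0.9):
    """First-order IIR filter: y[t] = alpha * y[t-1] + spikes[t].

    This is the discrete-time form of an exponentially decaying synaptic
    current, so stacking such filters with a spiking nonlinearity matches
    the framing of an SNN as IIR filters plus neuron nonlinearity."""
    y = np.zeros(len(spikes), dtype=float)
    acc = 0.0
    for t, s in enumerate(spikes):
        acc = alpha * acc + s
        y[t] = acc
    return y

# Toy usage: a spike train filtered into a smooth synaptic current.
print(np.round(iir_synapse(np.array([0, 1, 0, 0, 1, 1, 0, 0, 0, 0])), 3))
```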
arXiv Detail & Related papers (2020-02-19T01:27:39Z)