Recurrent Neural Network Learning of Performance and Intrinsic
Population Dynamics from Sparse Neural Data
- URL: http://arxiv.org/abs/2005.02211v1
- Date: Tue, 5 May 2020 14:16:54 GMT
- Title: Recurrent Neural Network Learning of Performance and Intrinsic
Population Dynamics from Sparse Neural Data
- Authors: Alessandro Salatiello and Martin A. Giese
- Abstract summary: We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recurrent Neural Networks (RNNs) are popular models of brain function. The
typical training strategy is to adjust their input-output behavior so that it
matches that of the biological circuit of interest. Even though this strategy
ensures that the biological and artificial networks perform the same
computational task, it does not guarantee that their internal activity dynamics
match. This suggests that the trained RNNs might end up performing the task
employing a different internal computational mechanism, which would make them a
suboptimal model of the biological circuit. In this work, we introduce a novel
training strategy that allows learning not only the input-output behavior of an
RNN but also its internal network dynamics, based on sparse neural recordings.
We test the proposed method by training an RNN to simultaneously reproduce
internal dynamics and output signals of a physiologically-inspired neural
model. Specifically, this model generates the multiphasic muscle-like activity
patterns typically observed during the execution of reaching movements, based
on the oscillatory activation patterns concurrently observed in the motor
cortex. Remarkably, we show that the reproduction of the internal dynamics is
successful even when the training algorithm relies on the activities of a small
subset of neurons sampled from the biological network. Furthermore, we show
that training the RNNs with this method significantly improves their
generalization performance. Overall, our results suggest that the proposed
method is suitable for building powerful functional RNN models, which
automatically capture important computational properties of the biological
circuit of interest from sparse neural recordings.
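To make the training strategy concrete, the following is a minimal sketch in PyTorch of the core idea as described in the abstract: the RNN is optimized under a combined objective that penalizes both the output error and the mismatch between a small observed subset of its hidden units and sparse recordings from the reference network. All names (train_step, obs_idx, lam), shapes, the vanilla-RNN architecture, the equal loss weighting, and the placeholder data are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the paper's training idea (not the authors' code): fit an RNN so
# that (i) its readout matches the target output signals and (ii) the activity
# of a small "observed" subset of its units matches sparse recordings from the
# reference (biological) network. Shapes, names, and `lam` are assumptions.
import torch
import torch.nn as nn

T, B, n_in, n_hid, n_out = 100, 16, 3, 128, 2
n_obs = 10  # number of "recorded" units (sparse sampling)

rnn = nn.RNN(n_in, n_hid)             # vanilla RNN; the paper's model may differ
readout = nn.Linear(n_hid, n_out)     # linear readout producing output signals
obs_idx = torch.randperm(n_hid)[:n_obs]  # model units paired with recorded neurons

opt = torch.optim.Adam(list(rnn.parameters()) + list(readout.parameters()), lr=1e-3)
lam = 1.0  # assumed weight trading off output matching vs. dynamics matching

def train_step(u, y_target, r_obs):
    """u: (T, B, n_in) inputs; y_target: (T, B, n_out) target outputs;
    r_obs: (T, B, n_obs) sparse recordings of the reference network."""
    h, _ = rnn(u)                      # (T, B, n_hid) hidden-state trajectories
    y = readout(h)                     # (T, B, n_out) network output
    loss_out = ((y - y_target) ** 2).mean()              # input-output behavior
    loss_dyn = ((h[:, :, obs_idx] - r_obs) ** 2).mean()  # internal dynamics
    loss = loss_out + lam * loss_dyn
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage with stand-in data, loosely mimicking the paper's setting in which
# oscillatory inputs drive multiphasic, muscle-like target outputs. The random
# targets and recordings below are placeholders, not the paper's teacher model.
t = torch.linspace(0, 1, T).view(T, 1, 1)
u = torch.cat([torch.sin(2 * torch.pi * f * t) for f in (1.0, 2.0, 3.0)], dim=2)
u = u.expand(T, B, n_in).contiguous()
y_target = torch.randn(T, B, n_out)   # placeholder output targets
r_obs = torch.randn(T, B, n_obs)      # placeholder sparse "recordings"
for step in range(200):
    loss = train_step(u, y_target, r_obs)
```

The key point the sketch illustrates is that loss_dyn only touches the small subset indexed by obs_idx, mirroring the paper's claim that matching the internal dynamics succeeds even when the training signal comes from a small sample of neurons.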
Related papers
- Exploiting Heterogeneity in Timescales for Sparse Recurrent Spiking Neural Networks for Energy-Efficient Edge Computing (2024-07-08): Spiking Neural Networks (SNNs) represent the forefront of neuromorphic computing. This paper weaves together three studies aimed at improving SNN performance.
- Topological Representations of Heterogeneous Learning Dynamics of Recurrent Spiking Neural Networks (2024-03-19): Spiking Neural Networks (SNNs) have become an essential paradigm in neuroscience and artificial intelligence. Recent advances in the literature have studied the network representations of deep neural networks.
- Benchmarking Spiking Neural Network Learning Methods with Varying Locality (2024-02-01): Spiking Neural Networks (SNNs) provide more realistic neuronal dynamics, processing information as spikes in an event-based mechanism. Training SNNs is challenging due to the non-differentiable nature of the spiking mechanism.
- How neural networks learn to classify chaotic time series (2023-06-04): This paper studies the inner workings of neural networks trained to classify regular-versus-chaotic time series, finding that the relation between input periodicity and activation periodicity is key to the performance of LKCNN models.
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks (2023-05-26): Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in pattern recognition tasks. Existing SNNs are grounded on homogeneous neurons that use a uniform neural coding for information representation; this study argues that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits (2023-03-30): This work addresses the challenge of designing neurobiologically motivated schemes for adjusting the synapses of spiking networks. Its simulations demonstrate a consistent advantage over other biologically plausible approaches when training recurrent spiking networks.
- SIT: A Bionic and Non-Linear Neuron for Spiking Neural Network (2022-03-30): Spiking Neural Networks (SNNs) have attracted researchers' interest because of their capacity to process temporal information at low power consumption. Current state-of-the-art methods are limited in biological plausibility and performance because their neurons are generally built on the simple Leaky-Integrate-and-Fire (LIF) model; owing to their high dynamic complexity, modern neuron models have seldom been implemented in SNN practice.
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis (2021-04-22): This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model, which lets neurons maintain a form of dynamic equilibrium by automatically regulating their activity in response to input. Experiments demonstrate the model's ability to adapt to and continually learn from its input.
- Neuroevolution of a Recurrent Neural Network for Spatial and Working Memory in a Simulated Robotic Environment (2021-02-25): The authors evolve the weights of a biologically plausible recurrent neural network (RNN) with an evolutionary algorithm to replicate the behavior and neural activity observed in rats, demonstrating how the dynamic activity in evolved RNNs can capture interesting and complex cognitive behavior.
This list is automatically generated from the titles and abstracts of the papers on this site. The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.