Contrastive-Signal-Dependent Plasticity: Forward-Forward Learning of
Spiking Neural Systems
- URL: http://arxiv.org/abs/2303.18187v2
- Date: Thu, 7 Dec 2023 06:09:41 GMT
- Title: Contrastive-Signal-Dependent Plasticity: Forward-Forward Learning of
Spiking Neural Systems
- Authors: Alexander Ororbia
- Abstract summary: We develop a neuro-mimetic architecture, composed of spiking neuronal units, where individual layers of neurons operate in parallel.
We propose an event-based generalization of forward-forward learning, which we call contrastive-signal-dependent plasticity (CSDP).
Our experimental results on several pattern datasets demonstrate that the CSDP process works well for training a dynamic recurrent spiking network capable of both classification and reconstruction.
- Score: 73.18020682258606
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We develop a neuro-mimetic architecture, composed of spiking neuronal units,
where individual layers of neurons operate in parallel and adapt their synaptic
efficacies without the use of feedback pathways. Specifically, we propose an
event-based generalization of forward-forward learning, which we call
contrastive-signal-dependent plasticity (CSDP), for a spiking neural system
that iteratively processes sensory input over a stimulus window. The dynamics
that underwrite this recurrent circuit entail computing the membrane potential
of each processing element, in each layer, as a function of local bottom-up,
top-down, and lateral signals, facilitating a dynamic, layer-wise parallel form
of neural computation. Unlike other models, such as spiking predictive coding,
which rely on feedback synapses to adjust neural electrical activity, our model
operates purely online and forward in time, offering a promising way to learn
distributed representations of sensory data patterns, with and without labeled
context information. Notably, our experimental results on several pattern
datasets demonstrate that the CSDP process works well for training a dynamic
recurrent spiking network capable of both classification and reconstruction.
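The abstract describes each layer's membrane potential as a function of local bottom-up, top-down, and lateral signals, adapted with a forward-forward-style contrastive rule. A minimal sketch of that idea (not the paper's implementation; all constants, the trace decay, and the `csdp_step` form are illustrative assumptions):

```python
import numpy as np

def run_layer(x_bottom, z_top, W_b, W_t, L, steps=20, thresh=1.0, leak=0.9):
    """Integrate bottom-up, top-down, and lateral currents into a membrane
    potential; emit spikes on threshold crossing and keep an activity trace."""
    v = np.zeros(W_b.shape[0])
    spikes = np.zeros_like(v)
    trace = np.zeros_like(v)
    for _ in range(steps):
        current = W_b @ x_bottom + W_t @ z_top + L @ spikes
        v = leak * v + current
        spikes = (v >= thresh).astype(float)
        v = np.where(spikes > 0.0, 0.0, v)  # reset units that fired
        trace = 0.9 * trace + spikes        # exponentially decaying spike trace
    return trace

def goodness(trace):
    """Forward-forward 'goodness': summed squared layer activity."""
    return float(np.sum(trace ** 2))

def csdp_step(W_b, x_bottom, trace, theta, lr=0.01, positive=True):
    """Hypothetical local contrastive update: push goodness above the
    threshold theta on positive (real) samples, below it on negatives."""
    g = goodness(trace)
    p = 1.0 / (1.0 + np.exp(-(g - theta)))   # estimated P(sample is positive)
    factor = (1.0 - p) if positive else -p   # purely local modulating factor
    return W_b + lr * factor * np.outer(trace, x_bottom)
```

Note that the update uses only quantities local to the layer (its own trace and its input), consistent with the abstract's claim of learning forward in time without feedback pathways.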
Related papers
- DYNAP-SE2: a scalable multi-core dynamic neuromorphic asynchronous
spiking neural network processor [2.9175555050594975]
We present a brain-inspired platform for prototyping real-time event-based Spiking Neural Networks (SNNs).
The system proposed supports the direct emulation of dynamic and realistic neural processing phenomena such as short-term plasticity, NMDA gating, AMPA diffusion, homeostasis, spike frequency adaptation, conductance-based dendritic compartments and spike transmission delays.
The flexibility to emulate different biologically plausible neural networks, and the chip's ability to monitor both population and single-neuron signals in real time, make it possible to develop and validate complex models of neural processing for both basic research and edge-computing applications.
arXiv Detail & Related papers (2023-10-01T03:48:16Z) - The Predictive Forward-Forward Algorithm [79.07468367923619]
We propose the predictive forward-forward (PFF) algorithm for conducting credit assignment in neural systems.
We design a novel, dynamic recurrent neural system that learns a directed generative circuit jointly and simultaneously with a representation circuit.
PFF efficiently learns to propagate learning signals and updates synapses with forward passes only.
arXiv Detail & Related papers (2023-01-04T05:34:48Z) - Convolutional Neural Generative Coding: Scaling Predictive Coding to
Natural Images [79.07468367923619]
We develop convolutional neural generative coding (Conv-NGC).
We implement a flexible neurobiologically-motivated algorithm that progressively refines latent state maps.
We study the effectiveness of our brain-inspired neural system on the tasks of reconstruction and image denoising.
arXiv Detail & Related papers (2022-11-22T06:42:41Z) - Spike-based local synaptic plasticity: A survey of computational models
and neuromorphic circuits [1.8464222520424338]
We review historical, bottom-up, and top-down approaches to modeling synaptic plasticity.
We identify computational primitives that can support low-latency and low-power hardware implementations of spike-based learning rules.
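Among the best-known spike-based learning rules this survey covers is pair-based spike-timing-dependent plasticity (STDP). A minimal sketch of the standard pair-based rule (the amplitudes and time constant are illustrative, not taken from the survey):

```python
import numpy as np

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight change for one pre/post spike pair (times in ms):
    potentiate when the presynaptic spike precedes the postsynaptic one,
    depress when it follows, with exponentially decaying magnitude."""
    dt = t_post - t_pre
    if dt >= 0:
        return a_plus * np.exp(-dt / tau)    # pre before post -> LTP
    return -a_minus * np.exp(dt / tau)       # post before pre -> LTD
```

Because the rule depends only on the relative timing of two local spikes, it is exactly the kind of computational primitive the survey highlights as amenable to low-latency, low-power hardware implementation.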
arXiv Detail & Related papers (2022-09-30T15:35:04Z) - Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z) - Neuronal Learning Analysis using Cycle-Consistent Adversarial Networks [4.874780144224057]
We use CycleGAN, a variant of deep generative models, to learn the unknown mapping between pre- and post-learning neural activities.
We develop an end-to-end pipeline to preprocess, train and evaluate calcium fluorescence signals, and a procedure to interpret the resulting deep learning models.
arXiv Detail & Related papers (2021-11-25T13:24:19Z) - PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive
Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z) - Recurrent Neural Network Learning of Performance and Intrinsic
Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z) - Structural plasticity on an accelerated analog neuromorphic hardware
system [0.46180371154032884]
We present a strategy to achieve structural plasticity by constantly rewiring the pre- and postsynaptic partners.
We implemented this algorithm on the analog neuromorphic system BrainScaleS-2.
We evaluated our implementation in a simple supervised learning scenario, showing its ability to optimize the network topology.
arXiv Detail & Related papers (2019-12-27T10:15:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.