Contrastive-Signal-Dependent Plasticity: Forward-Forward Learning of
Spiking Neural Systems
- URL: http://arxiv.org/abs/2303.18187v2
- Date: Thu, 7 Dec 2023 06:09:41 GMT
- Title: Contrastive-Signal-Dependent Plasticity: Forward-Forward Learning of
Spiking Neural Systems
- Authors: Alexander Ororbia
- Abstract summary: We develop a neuro-mimetic architecture, composed of spiking neuronal units, where individual layers of neurons operate in parallel.
We propose an event-based generalization of forward-forward learning, which we call contrastive-signal-dependent plasticity (CSDP)
Our experimental results on several pattern datasets demonstrate that the CSDP process works well for training a dynamic recurrent spiking network capable of both classification and reconstruction.
- Score: 73.18020682258606
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We develop a neuro-mimetic architecture, composed of spiking neuronal units,
where individual layers of neurons operate in parallel and adapt their synaptic
efficacies without the use of feedback pathways. Specifically, we propose an
event-based generalization of forward-forward learning, which we call
contrastive-signal-dependent plasticity (CSDP), for a spiking neural system
that iteratively processes sensory input over a stimulus window. The dynamics
that underwrite this recurrent circuit entail computing the membrane potential
of each processing element, in each layer, as a function of local bottom-up,
top-down, and lateral signals, facilitating a dynamic, layer-wise parallel form
of neural computation. Unlike other models, such as spiking predictive coding,
which rely on feedback synapses to adjust neural electrical activity, our model
operates purely online and forward in time, offering a promising way to learn
distributed representations of sensory data patterns, with and without labeled
context information. Notably, our experimental results on several pattern
datasets demonstrate that the CSDP process works well for training a dynamic
recurrent spiking network capable of both classification and reconstruction.
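The layer dynamics and contrastive update described in the abstract can be sketched, under heavy simplification, as follows. All names, sizes, and constants below are illustrative assumptions, not the paper's actual formulation: the sketch integrates leaky integrate-and-fire membrane potentials over a stimulus window from bottom-up and lateral spike signals, uses a forward-forward-style "goodness" (summed squared firing rates), and applies a rate-based Hebbian surrogate for the spike-driven CSDP update; top-down and label-context pathways are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes and LIF constants (assumptions, not from the paper)
n_in, n_hid = 784, 128
dt, tau, v_thresh = 1.0, 20.0, 1.0

W_bu = rng.normal(0.0, 0.1, (n_hid, n_in))     # bottom-up synapses
W_lat = rng.normal(0.0, 0.05, (n_hid, n_hid))  # lateral synapses
np.fill_diagonal(W_lat, 0.0)                   # no self-connections

def run_layer(x, T=25):
    """Integrate LIF membrane potentials over a stimulus window of T steps,
    driven by bottom-up input plus recurrent lateral spike feedback."""
    v = np.zeros(n_hid)
    spikes = np.zeros(n_hid)
    rates = np.zeros(n_hid)
    for _ in range(T):
        current = W_bu @ x + W_lat @ spikes
        v += (dt / tau) * (-v + current)       # leaky integration
        spikes = (v >= v_thresh).astype(float) # threshold -> spike events
        v[spikes > 0] = 0.0                    # reset after spiking
        rates += spikes
    return rates / T                           # mean firing rate per neuron

def goodness(rates):
    # Forward-forward "goodness": summed squared activity of the layer
    return float(np.sum(rates ** 2))

def csdp_step(x, positive, lr=1e-3):
    """One contrastive update: raise goodness for positive (real) inputs and
    lower it for negative (corrupted) inputs, using only locally available
    pre-synaptic input and post-synaptic rate -- no feedback pathway."""
    global W_bu
    rates = run_layer(x)
    sign = 1.0 if positive else -1.0
    W_bu += lr * sign * np.outer(rates, x)     # local Hebbian-style surrogate
    return goodness(rates)
```

In this reading, each layer can run its `csdp_step` in parallel on its own local signals, which is what makes the scheme forward-only and online; the real CSDP rule operates on spike events rather than this rate surrogate.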
Related papers
- Evolution imposes an inductive bias that alters and accelerates learning dynamics [49.1574468325115]
We investigate the effect of evolutionary optimization on the learning dynamics of neural networks. We combine natural selection and online learning algorithms to produce a method for evolutionarily conditioning artificial neural networks. Results suggest evolution constitutes an inductive bias that tunes neural systems to enable rapid learning.
arXiv Detail & Related papers (2025-05-15T18:50:57Z)
- Retinal Vessel Segmentation via Neuron Programming [17.609169389489633]
This paper introduces a novel approach to neural network design, termed "neuron programming", to enhance a network's representation ability at the neuronal level.
Comprehensive experiments validate that neuron programming can achieve competitive performance in retinal blood vessel segmentation.
arXiv Detail & Related papers (2024-11-17T16:03:30Z) - Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z) - Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically-informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models with task-driven SNNs, balancing bioinspiration and complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z) - Evolving Self-Assembling Neural Networks: From Spontaneous Activity to Experience-Dependent Learning [7.479827648985631]
We propose a class of self-organizing neural networks capable of synaptic and structural plasticity in an activity- and reward-dependent manner.
Our results demonstrate the ability of the model to learn from experiences in different control tasks starting from randomly connected or empty networks.
arXiv Detail & Related papers (2024-05-05T14:16:54Z)
- Brain-Inspired Machine Intelligence: A Survey of Neurobiologically-Plausible Credit Assignment [65.268245109828]
We examine algorithms for conducting credit assignment in artificial neural networks that are inspired or motivated by neurobiology.
We organize the ever-growing set of brain-inspired learning schemes into six general families and consider these in the context of backpropagation of errors.
The results of this review are meant to encourage future developments in neuro-mimetic systems and their constituent learning processes.
arXiv Detail & Related papers (2023-12-01T05:20:57Z)
- POPPINS: A Population-Based Digital Spiking Neuromorphic Processor with Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in 180 nm process technology with two hierarchical populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference processing applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z)
- Evolving spiking neuron cellular automata and networks to emulate in vitro neuronal activity [0.0]
We produce spiking neural systems that emulate the patterns of behavior of biological neurons in vitro.
Our models were able to produce a level of network-wide synchrony.
The genomes of the top-performing models indicate the excitability and density of connections in the model play an important role in determining the complexity of the produced activity.
arXiv Detail & Related papers (2021-10-15T17:55:04Z)
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.