Structural plasticity on an accelerated analog neuromorphic hardware
system
- URL: http://arxiv.org/abs/1912.12047v2
- Date: Wed, 30 Sep 2020 08:20:35 GMT
- Title: Structural plasticity on an accelerated analog neuromorphic hardware
system
- Authors: Sebastian Billaudelle, Benjamin Cramer, Mihai A. Petrovici, Korbinian
Schreiber, David Kappel, Johannes Schemmel, Karlheinz Meier
- Abstract summary: We present a strategy to achieve structural plasticity by constantly rewiring the pre- and postsynaptic partners.
We implemented this algorithm on the analog neuromorphic system BrainScaleS-2.
We evaluated our implementation in a simple supervised learning scenario, showing its ability to optimize the network topology.
- Score: 0.46180371154032884
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In computational neuroscience, as well as in machine learning, neuromorphic
devices promise an accelerated and scalable alternative to neural network
simulations. Their neural connectivity and synaptic capacity depend on their
specific design choices, but are always intrinsically limited. Here, we present
a strategy to achieve structural plasticity that optimizes resource allocation
under these constraints by constantly rewiring the pre- and postsynaptic
partners while keeping the neuronal fan-in constant and the connectome sparse.
In particular, we implemented this algorithm on the analog neuromorphic system
BrainScaleS-2. It was executed on a custom embedded digital processor located
on chip, accompanying the mixed-signal substrate of spiking neurons and synapse
circuits. We evaluated our implementation in a simple supervised learning
scenario, showing its ability to optimize the network topology with respect to
the nature of its training data, as well as its overall computational
efficiency.
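The rewiring scheme described above can be illustrated with a short software sketch. The following Python snippet is a simplified, host-side approximation and not the authors' on-chip implementation; the pruning threshold, the placeholder weight update, and the network sizes are illustrative assumptions. It replaces a neuron's weakest synapses with newly drawn presynaptic partners while keeping the fan-in constant and the connectome sparse.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

N_PRE = 256            # number of potential presynaptic partners (assumed)
N_POST = 32            # number of postsynaptic neurons (assumed)
FAN_IN = 16            # fixed number of synapses per postsynaptic neuron
PRUNE_THRESHOLD = 0.05 # synapses weaker than this are rewired (assumed)

# Each postsynaptic neuron holds FAN_IN synapses: a presynaptic index and a weight.
pre_idx = rng.integers(0, N_PRE, size=(N_POST, FAN_IN))
weights = rng.uniform(0.0, 1.0, size=(N_POST, FAN_IN))

def rewire(pre_idx, weights):
    """Replace weak synapses with new presynaptic partners, keeping the fan-in constant."""
    for post in range(N_POST):
        weak = np.flatnonzero(weights[post] < PRUNE_THRESHOLD)
        for syn in weak:
            # Draw a new partner that is not already connected to this neuron.
            candidates = np.setdiff1d(np.arange(N_PRE), pre_idx[post])
            pre_idx[post, syn] = rng.choice(candidates)
            weights[post, syn] = PRUNE_THRESHOLD  # re-initialize at the pruning boundary
    return pre_idx, weights

# One plasticity step: apply a weight update (placeholder for the actual learning rule),
# clip to the available weight range, then prune and rewire weak synapses.
weights += 0.01 * rng.standard_normal(weights.shape)
weights = np.clip(weights, 0.0, 1.0)
pre_idx, weights = rewire(pre_idx, weights)
```

On the hardware, the analogous pruning and rewiring steps would run on the embedded digital processor that accompanies the analog neuron and synapse circuits, with the weight update supplied by the chosen (e.g. supervised) learning rule.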
Related papers
- A Realistic Simulation Framework for Analog/Digital Neuromorphic Architectures [73.65190161312555]
ARCANA is a spiking neural network simulator designed to account for the properties of mixed-signal neuromorphic circuits.
We show how the results obtained provide a reliable estimate of the behavior of the spiking neural network trained in software.
arXiv Detail & Related papers (2024-09-23T11:16:46Z) - Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z) - Neuromorphic Auditory Perception by Neural Spiketrum [27.871072042280712]
We introduce a neural spike coding model called spiketrum to transform the time-varying analog signals into efficient spike patterns.
The model provides a sparse and efficient coding scheme with precisely controllable spike rate that facilitates training of spiking neural networks in various auditory perception tasks.
arXiv Detail & Related papers (2023-09-11T13:06:19Z) - Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z) - ETLP: Event-based Three-factor Local Plasticity for online learning with
neuromorphic hardware [105.54048699217668]
We show competitive accuracy with a clear advantage in computational complexity for Event-Based Three-factor Local Plasticity (ETLP).
We also show that when using local plasticity, threshold adaptation in spiking neurons and a recurrent topology are necessary to learn temporal patterns with a rich temporal structure.
arXiv Detail & Related papers (2023-01-19T19:45:42Z) - Spike-based local synaptic plasticity: A survey of computational models
and neuromorphic circuits [1.8464222520424338]
We review historical, bottom-up, and top-down approaches to modeling synaptic plasticity.
We identify computational primitives that can support low-latency and low-power hardware implementations of spike-based learning rules.
arXiv Detail & Related papers (2022-09-30T15:35:04Z) - The BrainScaleS-2 accelerated neuromorphic system with hybrid plasticity [0.0]
We describe the second generation of the BrainScaleS neuromorphic architecture, emphasizing applications enabled by this architecture.
It combines a custom accelerator core supporting the accelerated physical emulation of bio-inspired spiking neural network primitives with a tightly coupled digital processor and a digital event-routing network.
arXiv Detail & Related papers (2022-01-26T17:13:46Z) - POPPINS : A Population-Based Digital Spiking Neuromorphic Processor with
Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in 180nm process technology with two hierarchy populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference processing applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z) - Mapping and Validating a Point Neuron Model on Intel's Neuromorphic
Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z) - Learning in Deep Neural Networks Using a Biologically Inspired Optimizer [5.144809478361604]
We propose a novel biologically inspired optimizer for artificial neural networks (ANNs) and spiking neural networks (SNNs).
GRAPES implements a weight-distribution dependent modulation of the error signal at each node of the neural network.
We show that this biologically inspired mechanism leads to a systematic improvement of the convergence rate of the network, and substantially improves classification accuracy of ANNs and SNNs.
arXiv Detail & Related papers (2021-04-23T13:50:30Z) - Ultra-Low-Power FDSOI Neural Circuits for Extreme-Edge Neuromorphic
Intelligence [2.6199663901387997]
In-memory computing mixed-signal neuromorphic architectures provide promising ultra-low-power solutions for edge-computing sensory-processing applications.
We present a set of mixed-signal analog/digital circuits that exploit the features of advanced Fully-Depleted Silicon on Insulator (FDSOI) integration processes.
arXiv Detail & Related papers (2020-06-25T09:31:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.