A Robust Learning Rule for Soft-Bounded Memristive Synapses Competitive
with Supervised Learning in Standard Spiking Neural Networks
- URL: http://arxiv.org/abs/2204.05682v1
- Date: Tue, 12 Apr 2022 10:21:22 GMT
- Title: A Robust Learning Rule for Soft-Bounded Memristive Synapses Competitive
with Supervised Learning in Standard Spiking Neural Networks
- Authors: Thomas F. Tiotto, Jelmer P. Borst and Niels A. Taatgen
- Abstract summary: A view in theoretical neuroscience sees the brain as a function-computing device.
Being able to approximate functions is a fundamental axiom to build upon for future brain research.
In this work we apply a novel supervised learning algorithm - based on controlling niobium-doped strontium titanate memristive synapses - to learning non-trivial multidimensional functions.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Memristive devices are a class of circuit elements that show great promise
as future building blocks for brain-inspired computing. One influential view in
theoretical neuroscience sees the brain as a function-computing device: given
input signals, the brain applies a function in order to generate new internal
states and motor outputs. Therefore, being able to approximate functions is a
fundamental axiom to build upon for future brain research and to derive more
efficient computational machines. In this work we apply a novel supervised
learning algorithm - based on controlling niobium-doped strontium titanate
memristive synapses - to learning non-trivial multidimensional functions. By
implementing our method into the spiking neural network simulator Nengo, we
show that we are able to at least match the performance obtained when using
ideal, linear synapses and - in doing so - that this kind of memristive device
can be harnessed as a computational substrate to move towards more efficient,
brain-inspired computing.
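The abstract describes learning functions online in Nengo through memristor-controlled synapses. As a rough illustration only, the sketch below shows the standard Nengo pattern for supervised, online function learning; it uses Nengo's built-in PES rule as a stand-in for the paper's memristor-based rule, and the target function, ensemble sizes, and learning rate are illustrative assumptions rather than the authors' settings.

```python
# Minimal sketch (not the authors' code) of online supervised function learning
# in Nengo. The built-in PES rule stands in for the memristor-controlled rule
# from the paper; network sizes, signals, and rates are illustrative guesses.
import numpy as np
import nengo

def target_function(x):
    # Example of a non-trivial multidimensional function: 2-D input -> product.
    return x[0] * x[1]

with nengo.Network(seed=0) as model:
    stim = nengo.Node(lambda t: [np.sin(t), np.cos(2 * t)])  # 2-D input signal
    pre = nengo.Ensemble(n_neurons=200, dimensions=2)        # represents the input
    post = nengo.Ensemble(n_neurons=100, dimensions=1)       # represents f(x)
    error = nengo.Ensemble(n_neurons=100, dimensions=1)      # post minus target

    nengo.Connection(stim, pre)

    # Learned connection: starts out computing ~0 and is adapted online.
    conn = nengo.Connection(
        pre, post,
        function=lambda x: 0.0,
        learning_rule_type=nengo.PES(learning_rate=1e-4),
    )

    # Error = actual output minus the desired value of the function.
    nengo.Connection(post, error)
    nengo.Connection(pre, error, function=target_function, transform=-1)
    nengo.Connection(error, conn.learning_rule)

    out_probe = nengo.Probe(post, synapse=0.01)

with nengo.Simulator(model) as sim:
    sim.run(10.0)
# sim.data[out_probe] should track x[0]*x[1] increasingly well over time.
```

In the paper's setting, the analogous update is applied to memristive conductances, whose changes saturate as they approach the device limits ("soft bounds") rather than growing without limit.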
Related papers
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic
Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate multiple synaptic mechanisms.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z) - Hebbian Learning based Orthogonal Projection for Continual Learning of
Spiking Neural Networks [74.3099028063756]
We develop a new method with neuronal operations based on lateral connections and Hebbian learning.
We show that Hebbian and anti-Hebbian learning on recurrent lateral connections can effectively extract the principal subspace of neural activities (a generic sketch of this principle is given after this list).
Our method enables continual learning in spiking neural networks with nearly zero forgetting.
arXiv Detail & Related papers (2024-02-19T09:29:37Z) - A Neural Lambda Calculus: Neurosymbolic AI meets the foundations of
computing and functional programming [0.0]
We will analyze the ability of neural networks to learn how to execute programs as a whole.
We will introduce the use of integrated neural learning and calculi formalization.
arXiv Detail & Related papers (2023-04-18T20:30:16Z) - Sequence learning in a spiking neuronal network with memristive synapses [0.0]
A core concept that lies at the heart of brain computation is sequence learning and prediction.
Neuromorphic hardware emulates the way the brain processes information and maps neurons and synapses directly into a physical substrate.
We study the feasibility of using ReRAM devices as a replacement for the biological synapses in the sequence learning model.
arXiv Detail & Related papers (2022-11-29T21:07:23Z) - Encoding Integers and Rationals on Neuromorphic Computers using Virtual
Neuron [0.0]
We present the virtual neuron as an encoding mechanism for integers and rational numbers.
We show that it can perform an addition operation using 23 nJ of energy on average using a mixed-signal memristor-based neuromorphic processor.
arXiv Detail & Related papers (2022-08-15T23:18:26Z) - Memory-enriched computation and learning in spiking neural networks
through Hebbian plasticity [9.453554184019108]
Hebbian plasticity is believed to play a pivotal role in biological memory.
We introduce a novel spiking neural network architecture that is enriched by Hebbian synaptic plasticity.
We show that Hebbian enrichment renders spiking neural networks surprisingly versatile in terms of their computational as well as learning capabilities.
arXiv Detail & Related papers (2022-05-23T12:48:37Z) - Mapping and Validating a Point Neuron Model on Intel's Neuromorphic
Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth generation neuromorphic chip - Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z) - Neurocoder: Learning General-Purpose Computation Using Stored Neural
Programs [64.56890245622822]
Neurocoder is an entirely new class of general-purpose conditional computational machines.
It "codes" itself in a data-responsive way by composing relevant programs from a set of shareable, modular programs.
We show new capacity to learn modular programs, handle severe pattern shifts and remember old programs as new ones are learnt.
arXiv Detail & Related papers (2020-09-24T01:39:16Z) - Spiking Neural Networks Hardware Implementations and Challenges: a
Survey [53.429871539789445]
Spiking Neural Networks are cognitive algorithms mimicking neuron and synapse operational principles.
We present the state of the art of hardware implementations of spiking neural networks.
We discuss the strategies employed to leverage the characteristics of these event-driven algorithms at the hardware level.
arXiv Detail & Related papers (2020-05-04T13:24:00Z) - Memristors -- from In-memory computing, Deep Learning Acceleration,
Spiking Neural Networks, to the Future of Neuromorphic and Bio-inspired
Computing [25.16076541420544]
Machine learning, particularly in the form of deep learning, has driven most of the recent fundamental developments in artificial intelligence.
Deep learning has been successfully applied in areas such as object/pattern recognition, speech and natural language processing, self-driving vehicles, intelligent self-diagnostics tools, autonomous robots, knowledgeable personal assistants, and monitoring.
This paper reviews the case for a novel beyond CMOS hardware technology, memristors, as a potential solution for the implementation of power-efficient in-memory computing, deep learning accelerators, and spiking neural networks.
arXiv Detail & Related papers (2020-04-30T16:49:03Z) - Non-linear Neurons with Human-like Apical Dendrite Activations [81.18416067005538]
We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
We conduct experiments on six benchmark data sets from computer vision, signal processing and natural language processing.
arXiv Detail & Related papers (2020-02-02T21:09:39Z)
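For the Hebbian orthogonal-projection entry above, the following is a textbook illustration (Oja's subspace rule) of how Hebbian-style updates can extract the principal subspace of input activity. It is a generic sketch, not the method of that paper; all dimensions and learning rates are arbitrary assumptions.

```python
# Generic sketch of Hebbian principal-subspace extraction (Oja's subspace rule).
# Not the method of the cited paper; dimensions and rates are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_components, n_steps = 10, 3, 20000
eta = 1e-3  # learning rate

# Synthetic zero-mean inputs whose variance is concentrated in 3 directions.
mixing = rng.normal(size=(n_inputs, n_components))
W = rng.normal(scale=0.1, size=(n_inputs, n_components))  # feedforward weights

for _ in range(n_steps):
    x = mixing @ rng.normal(size=n_components) + 0.05 * rng.normal(size=n_inputs)
    y = W.T @ x  # output activities
    # Hebbian outer-product term plus a decay that keeps the columns bounded
    # and mutually decorrelated (the role played by anti-Hebbian competition).
    W += eta * (np.outer(x, y) - W @ np.outer(y, y))

# Compare span(W) with the true top principal subspace of the input covariance.
cov = mixing @ mixing.T + 0.05 ** 2 * np.eye(n_inputs)
top = np.linalg.eigh(cov)[1][:, -n_components:]   # top eigenvectors
proj = W @ np.linalg.pinv(W.T @ W) @ W.T          # projector onto span(W)
print("overlap (close to 3 when aligned):", np.trace(top.T @ proj @ top))
```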