Temporal credit assignment for one-shot learning utilizing a phase
transition material
- URL: http://arxiv.org/abs/2310.00066v1
- Date: Fri, 29 Sep 2023 18:18:12 GMT
- Title: Temporal credit assignment for one-shot learning utilizing a phase
transition material
- Authors: Alessandro R. Galloni, Yifan Yuan, Minning Zhu, Haoming Yu, Ravindra
S. Bisht, Chung-Tse Michael Wu, Christine Grienberger, Shriram Ramanathan and
Aaron D. Milstein
- Abstract summary: We show that devices based on a metal-insulator-transition material, vanadium dioxide (VO2), can be dynamically controlled to access a continuum of intermediate resistance states.
We exploit these device properties to emulate three aspects of neuronal analog computation: fast (1 ms) spiking in a neuronal soma compartment, slow (100 ms) spiking in a dendritic compartment, and ultraslow (1 s) biochemical signaling involved in temporal credit assignment for a recently discovered biological mechanism of one-shot learning.
- Score: 36.460125256873624
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Design of hardware based on biological principles of neuronal computation and
plasticity in the brain is a leading approach to realizing energy- and
sample-efficient artificial intelligence and learning machines. An important
factor in selection of the hardware building blocks is the identification of
candidate materials with physical properties suitable to emulate the large
dynamic ranges and varied timescales of neuronal signaling. Previous work has
shown that the all-or-none spiking behavior of neurons can be mimicked by
threshold switches utilizing phase transitions. Here we demonstrate that
devices based on a prototypical metal-insulator-transition material, vanadium
dioxide (VO2), can be dynamically controlled to access a continuum of
intermediate resistance states. Furthermore, the timescale of their intrinsic
relaxation can be configured to match a range of biologically-relevant
timescales from milliseconds to seconds. We exploit these device properties to
emulate three aspects of neuronal analog computation: fast (~1 ms) spiking in a
neuronal soma compartment, slow (~100 ms) spiking in a dendritic compartment,
and ultraslow (~1 s) biochemical signaling involved in temporal credit
assignment for a recently discovered biological mechanism of one-shot learning.
Simulations show that an artificial neural network using properties of VO2
devices to control an agent navigating a spatial environment can learn an
efficient path to a reward in up to 4-fold fewer trials than standard methods.
The phase relaxations described in our study may be engineered in a variety of
materials, and can be controlled by thermal, electrical, or optical stimuli,
suggesting further opportunities to emulate biological learning.
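To make the relaxation-cascade idea concrete, the following is a minimal, illustrative Python sketch, not the authors' device model or simulation: the forward-Euler update, the cascade wiring, and every parameter value (DT, TAU_SOMA, TAU_DENDRITE, TAU_TRACE, eta, the step indices) are assumptions chosen only to match the ~1 ms, ~100 ms, and ~1 s timescales named in the abstract. Three leaky state variables relax on those timescales, and the slowest trace acts as an eligibility signal that assigns credit in one shot when a reward arrives more than a second after the input.

    # Illustrative sketch only (assumed parameters, not the authors' model):
    # three leaky states relaxing on ~1 ms, ~100 ms, and ~1 s timescales,
    # with the slowest trace gating a one-shot weight update at a delayed reward.

    DT = 1e-4              # simulation step: 0.1 ms (assumed)
    TAU_SOMA = 1e-3        # ~1 ms fast somatic spiking
    TAU_DENDRITE = 100e-3  # ~100 ms slow dendritic spiking
    TAU_TRACE = 1.0        # ~1 s ultraslow biochemical / eligibility trace

    def relax(state, drive, tau, dt=DT):
        """One forward-Euler step of exponential relaxation toward `drive`."""
        return state + (drive - state) * (dt / tau)

    def run(n_steps=20000, input_step=100, reward_step=15000, eta=1.0):
        soma = dendrite = trace = 0.0
        weight = 0.0
        for step in range(n_steps):
            drive = 1.0 if step == input_step else 0.0
            # fast-to-slow cascade: each state is driven by the faster one
            soma = relax(soma, drive, TAU_SOMA)
            dendrite = relax(dendrite, soma, TAU_DENDRITE)
            trace = relax(trace, dendrite, TAU_TRACE)
            if step == reward_step:
                # delayed reward: credit is assigned in a single shot,
                # proportional to the surviving ultraslow trace, which
                # bridges the ~1.5 s gap between input and outcome
                weight += eta * trace
        return weight

    if __name__ == "__main__":
        print(f"one-shot weight change: {run():.2e}")

The point of the sketch is only the separation of timescales: the fast somatic state has vanished long before the reward, while the ~1 s trace still carries a usable record of the earlier input.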
Related papers
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic
Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- Learning with Chemical versus Electrical Synapses -- Does it Make a
Difference? [61.85704286298537]
Bio-inspired neural networks have the potential to advance our understanding of neural computation and improve the state-of-the-art of AI systems.
We conduct experiments with autonomous lane-keeping through a photorealistic autonomous driving simulator to evaluate their performance under diverse conditions.
arXiv Detail & Related papers (2023-11-21T13:07:20Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive
Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Organic log-domain integrator synapse [2.1640200483378953]
Here, a physically flexible organic log-domain integrator synaptic circuit is shown to address this challenge.
Using a 10 nF synaptic capacitor, the time constant reached 126 ms before bending and 221 ms during bending.
The circuit is characterized before and during bending, followed by studies on the effects of weighting voltage, synaptic capacitance, and disparity in pre-synaptic signals on the time constant.
arXiv Detail & Related papers (2022-03-23T17:11:47Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic
Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- Continuous Learning and Adaptation with Membrane Potential and
Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z)
- Neuromorphic Algorithm-hardware Codesign for Temporal Pattern
Learning [11.781094547718595]
We derive an efficient training algorithm for Leaky Integrate-and-Fire neurons that is capable of training an SNN to learn complex spatiotemporal patterns.
We have developed a CMOS circuit implementation of a memristor-based network of neurons and synapses that retains critical neural dynamics with reduced complexity.
arXiv Detail & Related papers (2021-04-21T18:23:31Z)
- Controllable reset behavior in domain wall-magnetic tunnel junction
artificial neurons for task-adaptable computation [1.4505273244528207]
Domain wall-magnetic tunnel junction (DW-MTJ) devices have been shown to be able to intrinsically capture biological neuron behavior.
We show that edgy-relaxed behavior can be implemented in DW-MTJ artificial neurons via three alternative mechanisms.
arXiv Detail & Related papers (2021-01-08T16:50:29Z)
- Accelerated Analog Neuromorphic Computing [0.0]
This paper presents the concepts behind the BrainScaleS (BSS) accelerated analog neuromorphic computing architecture.
It describes the second-generation BrainScaleS-2 (BSS-2) version and its most recent in-silico realization, the HICANN-X Application Specific Integrated Circuit (ASIC).
The presented architecture is based upon a continuous-time, analog, physical model implementation of neurons and synapses.
arXiv Detail & Related papers (2020-03-26T16:00:55Z)