An Adiabatic Capacitive Artificial Neuron with RRAM-based Threshold
Detection for Energy-Efficient Neuromorphic Computing
- URL: http://arxiv.org/abs/2202.01144v2
- Date: Sat, 18 Jun 2022 23:55:12 GMT
- Title: An Adiabatic Capacitive Artificial Neuron with RRAM-based Threshold
Detection for Energy-Efficient Neuromorphic Computing
- Authors: Sachin Maheshwari, Alexander Serb, Christos Papavassiliou,
Themistoklis Prodromakis
- Abstract summary: We present an artificial neuron featuring adiabatic synapse capacitors to produce membrane potentials for the somas of neurons.
Our initial 4-bit adiabatic capacitive neuron proof-of-concept example shows 90% synaptic energy saving.
- Score: 62.997667081978825
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the quest for low power, bio-inspired computation both memristive and
memcapacitive-based Artificial Neural Networks (ANN) have been the subjects of
increasing focus for hardware implementation of neuromorphic computing. One
step further, regenerative capacitive neural networks, which call for the use
of adiabatic computing, offer a tantalising route towards even lower energy
consumption, especially when combined with 'memimpedance' elements. Here, we
present an artificial neuron featuring adiabatic synapse capacitors to produce
membrane potentials for the somas of neurons; the latter implemented via
dynamic latched comparators augmented with Resistive Random-Access Memory
(RRAM) devices. Our initial 4-bit adiabatic capacitive neuron proof-of-concept
example shows 90% synaptic energy saving. At 4 synapses/soma we already witness
an overall 35% energy reduction. Furthermore, the impact of process and
temperature on the 4-bit adiabatic synapse shows a maximum energy variation of
30% at 100 degrees Celsius across the process corners without any loss of functionality.
Finally, the efficacy of our adiabatic approach to ANNs is tested for 512 and 1024
synapses/neuron under worst- and best-case synapse loading conditions and variable
equalising capacitances, quantifying the expected trade-off between
equalisation capacitance and the range of optimal power-clock frequencies vs.
loading (i.e. the percentage of active synapses).
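The two mechanisms in the abstract can be made concrete with a back-of-the-envelope sketch: a binary-weighted capacitive synapse array produces the membrane potential by charge sharing, and ramped (adiabatic) charging dissipates far less than abrupt charging. All component values below (`C_SYNAPSE`, `V_DD`, `R_ON`, `T_RAMP`) are illustrative assumptions, not the paper's sizing; the dissipation formula E ≈ (RC/T)·CV² is the standard first-order result for ramped charging, not a reproduction of the paper's circuit analysis.

```python
# Back-of-the-envelope model of a 4-bit capacitive synapse array driven by
# an adiabatic power clock. All component values are illustrative assumptions.
C_SYNAPSE = [1e-15, 2e-15, 4e-15, 8e-15]  # hypothetical binary-weighted caps (F)
V_DD = 1.0      # power-clock amplitude (V), assumed
R_ON = 1e3      # effective charging-path resistance (ohm), assumed
T_RAMP = 1e-9   # power-clock ramp time (s), assumed

def membrane_potential(bits):
    """Charge-sharing output of the synapse array:
    V_mem = V_DD * sum(b_i * C_i) / sum(C_i)."""
    c_active = sum(c for b, c in zip(bits, C_SYNAPSE) if b)
    return V_DD * c_active / sum(C_SYNAPSE)

def conventional_energy(c):
    """Dissipated when charging C abruptly from a fixed rail: E = C V^2 / 2."""
    return 0.5 * c * V_DD**2

def adiabatic_energy(c):
    """First-order dissipation under a slow ramp (T >> RC): E ~ (RC/T) C V^2."""
    return (R_ON * c / T_RAMP) * c * V_DD**2

c_tot = sum(C_SYNAPSE)
print(f"V_mem for input code [0, 1, 0, 1]: {membrane_potential([0, 1, 0, 1]):.3f} V")
print(f"adiabatic / conventional energy: "
      f"{adiabatic_energy(c_tot) / conventional_energy(c_tot):.2%}")
```

The slower the power-clock ramp relative to the RC time constant, the smaller the fraction of CV² that is dissipated, which is the mechanism behind the reported synaptic energy saving; the trade-off against power-clock frequency and loading quantified in the paper follows from the same T_RAMP dependence.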
Related papers
- Resistive Memory-based Neural Differential Equation Solver for Score-based Diffusion Model [55.116403765330084]
Current AIGC methods, such as score-based diffusion, still fall short in speed and efficiency.
We propose a time-continuous and analog in-memory neural differential equation solver for score-based diffusion.
We experimentally validate our solution with 180 nm resistive memory in-memory computing macros.
arXiv Detail & Related papers (2024-04-08T16:34:35Z)
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- Learning with Chemical versus Electrical Synapses -- Does it Make a Difference? [61.85704286298537]
Bio-inspired neural networks have the potential to advance our understanding of neural computation and improve the state-of-the-art of AI systems.
We conduct experiments with autonomous lane-keeping through a photorealistic autonomous driving simulator to evaluate their performance under diverse conditions.
arXiv Detail & Related papers (2023-11-21T13:07:20Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Synaptic Stripping: How Pruning Can Bring Dead Neurons Back To Life [0.0]
We introduce Synaptic Stripping as a means to combat the dead neuron problem.
By automatically removing problematic connections during training, we can regenerate dead neurons.
We conduct several ablation studies to investigate these dynamics as a function of network width and depth.
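The dead-neuron recovery described above can be sketched in a few lines. This is only a rough illustration of the general idea (detect ReLU units that never fire, then give their incoming connections a fresh start); the detection rule, reinitialisation scheme, and all names are hypothetical and not the paper's actual Synaptic Stripping procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def strip_dead_neurons(W, b, X, reinit_scale=0.1):
    """Detect ReLU units that never activate on a batch X and reinitialise
    their incoming connections (a rough stand-in for pruning the
    problematic connections during training)."""
    pre = X @ W + b                # pre-activations, shape (batch, units)
    dead = ~(pre > 0).any(axis=0)  # units with no positive pre-activation on X
    n_dead = int(dead.sum())
    if n_dead:
        # Replace the dead units' incoming weights and zero their biases,
        # giving those units a chance to learn again.
        W[:, dead] = rng.normal(0.0, reinit_scale, size=(W.shape[0], n_dead))
        b[dead] = 0.0
    return n_dead

# Toy example: one unit is forced dead by a large negative bias.
X = rng.normal(size=(32, 4))
W = rng.normal(scale=0.5, size=(4, 3))
b = np.array([0.0, -100.0, 0.0])   # unit 1 can never fire on any input
print("dead units revived:", strip_dead_neurons(W, b, X))
```

After one pass, the previously dead unit has fresh weights and a zero bias, so it produces positive pre-activations again on typical inputs.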
arXiv Detail & Related papers (2023-02-11T23:55:50Z)
- Sequence learning in a spiking neuronal network with memristive synapses [0.0]
A core concept that lies at the heart of brain computation is sequence learning and prediction.
Neuromorphic hardware emulates the way the brain processes information and maps neurons and synapses directly into a physical substrate.
We study the feasibility of using ReRAM devices as a replacement of the biological synapses in the sequence learning model.
arXiv Detail & Related papers (2022-11-29T21:07:23Z)
- Energy-Efficient High-Accuracy Spiking Neural Network Inference Using Time-Domain Neurons [0.18352113484137625]
This paper presents a low-power highly linear time-domain I&F neuron circuit.
The proposed neuron leads to more than 4.3x lower error rate on the MNIST inference.
The power consumed by the proposed neuron circuit is simulated to be 0.230 µW per neuron, which is orders of magnitude lower than the existing voltage-domain neurons.
arXiv Detail & Related papers (2022-02-04T08:24:03Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- Thermal-Aware Compilation of Spiking Neural Networks to Neuromorphic Hardware [0.30458514384586394]
We propose a technique to map neurons and synapses of SNN-based machine learning workloads to neuromorphic hardware.
We demonstrate an average 11.4 reduction in the average temperature of each crossbar in the hardware, leading to a 52% reduction in the leakage power consumption.
arXiv Detail & Related papers (2020-10-09T19:29:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.