On the Mitigation of Read Disturbances in Neuromorphic Inference Hardware
- URL: http://arxiv.org/abs/2201.11527v1
- Date: Thu, 27 Jan 2022 14:02:54 GMT
- Title: On the Mitigation of Read Disturbances in Neuromorphic Inference Hardware
- Authors: Ankita Paul and Shihao Song and Twisha Titirsha and Anup Das
- Abstract summary: Non-Volatile Memory (NVM) cells are used in neuromorphic hardware to store model parameters.
NVM cells suffer from the read disturb issue, where the programmed resistance state drifts upon repeated access of a cell during inference.
We propose a system software framework to incorporate such dependencies when programming model parameters on the NVM cells of neuromorphic hardware.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Non-Volatile Memory (NVM) cells are used in neuromorphic hardware to store
model parameters, which are programmed as resistance states. NVMs suffer from
the read disturb issue, where the programmed resistance state drifts upon
repeated access of a cell during inference. Resistance drift can lower
inference accuracy. To address this, it is necessary to periodically reprogram
the model parameters, a high-overhead operation. We study read disturb failures
of an NVM cell. Our analysis shows a strong dependency both on model
characteristics, such as synaptic activation and criticality, and on the
voltage used to read resistance states during inference. We propose a system
software framework that incorporates these dependencies when programming model
parameters on the NVM cells of neuromorphic hardware. Our framework consists of
a convex optimization formulation that aims to place synaptic weights that are
frequently activated and critical, i.e., those with a high impact on accuracy,
on the NVM cells that are exposed to lower voltages during inference. In this
way, we increase the time interval between consecutive reprogrammings of the
model parameters. We evaluate our system software with many emerging inference
models on a neuromorphic hardware simulator and show a significant reduction in
system overhead.
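To make the mapping idea concrete, the sketch below gives one plausible reading
of the convex formulation: a linear-programming relaxation of the assignment of
synaptic weights to NVM cells that minimizes total disturb exposure. All inputs
(activation counts, criticality scores, per-cell read voltages) are hypothetical
stand-ins; the paper's exact objective and constraints are not given in the
abstract.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n = 8  # toy size: n synaptic weights mapped onto n NVM cells

# Hypothetical per-weight statistics obtained by profiling the model.
activation = rng.integers(1, 100, size=n).astype(float)  # read frequency
criticality = rng.random(n)                              # impact on accuracy
importance = activation * criticality                    # disturb sensitivity

# Hypothetical per-cell read voltages (depend on crossbar position).
voltage = rng.uniform(0.1, 0.5, size=n)

# X[i, j] is the (relaxed) indicator that weight i is programmed on cell j.
X = cp.Variable((n, n), nonneg=True)
constraints = [
    cp.sum(X, axis=0) == 1,  # each cell stores exactly one weight
    cp.sum(X, axis=1) == 1,  # each weight goes to exactly one cell
]
# Total disturb exposure: important weights should sit on low-voltage cells.
objective = cp.Minimize(importance @ X @ voltage)
cp.Problem(objective, constraints).solve()

# The assignment LP has integral vertices (Birkhoff polytope), so the
# relaxed optimum is a permutation; recover it by row-wise argmax.
assignment = np.argmax(X.value, axis=1)
print("weight -> cell:", assignment)
```

Because the constraint set is the Birkhoff polytope, this relaxation is tight
and the recovered permutation is an optimal integral assignment; larger
instances could instead use scipy.optimize.linear_sum_assignment with the cost
matrix importance[:, None] * voltage[None, :].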
Related papers
- A Realistic Simulation Framework for Analog/Digital Neuromorphic Architectures [73.65190161312555]
ARCANA is a spiking neural network simulator designed to account for the properties of mixed-signal neuromorphic circuits.
We show how the results obtained provide a reliable estimate of the behavior of the spiking neural network trained in software.
arXiv Detail & Related papers (2024-09-23T11:16:46Z)
- Resistive Memory-based Neural Differential Equation Solver for Score-based Diffusion Model [55.116403765330084]
Current AIGC methods, such as score-based diffusion, still fall short in speed and efficiency.
We propose a time-continuous and analog in-memory neural differential equation solver for score-based diffusion.
We experimentally validate our solution with 180 nm resistive memory in-memory computing macros.
arXiv Detail & Related papers (2024-04-08T16:34:35Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Efficient hierarchical Bayesian inference for spatio-temporal regression models in neuroimaging [6.512092052306553]
Examples include M/EEG inverse problems, encoding neural models for task-based fMRI analyses, and temperature monitoring schemes.
We devise a novel flexible hierarchical Bayesian framework within which the spatio-temporal dynamics of model parameters and noise are modeled.
arXiv Detail & Related papers (2021-11-02T15:50:01Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Improving Inference Lifetime of Neuromorphic Systems via Intelligent Synapse Mapping [0.2578242050187029]
An RRAM cell can switch its state after its content has been read a certain number of times.
We propose an architectural solution to extend the read endurance of RRAM-based neuromorphic systems.
arXiv Detail & Related papers (2021-06-16T20:12:47Z)
- Dynamic Reliability Management in Neuromorphic Computing [8.616676521313815]
Neuromorphic computing systems use non-volatile memory (NVM) to implement high-density and low-energy synaptic storage.
Currents needed to operate NVMs cause aging of CMOS-based transistors in each neuron and synapse circuit in the hardware.
We propose a new technique to mitigate the aging-related reliability problems in neuromorphic systems, by designing an intelligent run-time manager.
arXiv Detail & Related papers (2021-05-05T13:17:17Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Neural Closure Models for Dynamical Systems [35.000303827255024]
We develop a novel methodology to learn non-Markovian closure parameterizations for low-fidelity models.
New "neural closure models" augment low-fidelity models with neural delay differential equations (nDDEs)
We show that using non-Markovian over Markovian closures improves long-term accuracy and requires smaller networks.
arXiv Detail & Related papers (2020-12-27T05:55:33Z)
- Lipschitz Recurrent Neural Networks [100.72827570987992]
We show that our Lipschitz recurrent unit is more robust with respect to input and parameter perturbations as compared to other continuous-time RNNs.
Our experiments demonstrate that the Lipschitz RNN can outperform existing recurrent units on a range of benchmark tasks.
arXiv Detail & Related papers (2020-06-22T08:44:52Z)
- Improving Dependability of Neuromorphic Computing With Non-Volatile Memory [5.306819482496464]
This paper proposes RENEU, a reliability-oriented approach to map machine learning applications to neuromorphic hardware.
Fundamental to RENEU is a novel formulation of the aging of CMOS-based circuits in neuromorphic hardware, considering different failure mechanisms.
Our results demonstrate an average 38% reduction in circuit aging, leading to an average 18% improvement in the lifetime of the hardware compared to current practices.
arXiv Detail & Related papers (2020-06-10T14:50:28Z)