A Case for Lifetime Reliability-Aware Neuromorphic Computing
- URL: http://arxiv.org/abs/2007.02210v1
- Date: Sat, 4 Jul 2020 23:53:13 GMT
- Title: A Case for Lifetime Reliability-Aware Neuromorphic Computing
- Authors: Shihao Song and Anup Das
- Abstract summary: We evaluate the long-term, i.e., lifetime reliability impact of executing state-of-the-art machine learning tasks on neuromorphic hardware.
We show the reliability-performance trade-off obtained due to periodic relaxation of neuromorphic circuits.
- Score: 0.30458514384586394
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Neuromorphic computing with non-volatile memory (NVM) can significantly
improve performance and lower energy consumption of machine learning tasks
implemented using spike-based computations and bio-inspired learning
algorithms. High voltages required to operate certain NVMs such as phase-change
memory (PCM) can accelerate aging in a neuron's CMOS circuit, thereby reducing
the lifetime of neuromorphic hardware. In this work, we evaluate the long-term,
i.e., lifetime reliability impact of executing state-of-the-art machine
learning tasks on neuromorphic hardware, considering failure models such as
negative bias temperature instability (NBTI) and time-dependent dielectric
breakdown (TDDB). Based on this formulation, we show the
reliability-performance trade-off obtained due to periodic relaxation of
neuromorphic circuits, i.e., a stop-and-go style of neuromorphic computing.
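The stop-and-go idea described above can be illustrated with a small simulation. This is a hedged sketch using a generic reaction-diffusion-style NBTI power law with hypothetical constants (`A`, `N`, `recovery`), not the authors' actual formulation: periodic relaxation partially heals the threshold-voltage shift at the cost of wall-clock time.

```python
# Hypothetical NBTI power-law constants (illustrative only).
A, N = 5e-3, 1.0 / 6.0

def nbti_dvth(stress_time):
    """Threshold-voltage shift (V) after `stress_time` seconds of stress."""
    return A * stress_time ** N

def stop_and_go(total_work, run_s, relax_s, recovery=0.3):
    """Alternate run/relax phases; relaxation heals a fraction of the shift.

    Returns (accumulated_dVth, wall_clock_seconds)."""
    dvth, worked, clock = 0.0, 0.0, 0.0
    while worked < total_work:
        step = min(run_s, total_work - worked)
        age = (dvth / A) ** (1.0 / N)   # invert power law -> effective age
        dvth = nbti_dvth(age + step)    # continue stressing from that age
        worked += step
        clock += step
        if worked < total_work:
            dvth *= 1.0 - recovery      # partial recovery while relaxed
            clock += relax_s
    return dvth, clock
```

Comparing `stop_and_go(4.0, 1.0, 0.5)` against an uninterrupted run `stop_and_go(4.0, 4.0, 0.5)` shows the trade-off: the relaxed schedule accumulates less aging but takes longer to finish the same work.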
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z) - Resistive Memory-based Neural Differential Equation Solver for Score-based Diffusion Model [55.116403765330084]
Current AIGC methods, such as score-based diffusion, are still deficient in terms of rapidity and efficiency.
We propose a time-continuous and analog in-memory neural differential equation solver for score-based diffusion.
We experimentally validate our solution with 180 nm resistive memory in-memory computing macros.
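As a point of reference for what such a solver computes, here is a hedged, purely digital sketch of a probability-flow ODE integrator for a toy score-based model (the paper's solver is analog, built from resistive memory). The score is analytic only because the "data" here is assumed to be 1-D Gaussian N(0, 1) under a variance-exploding schedule sigma(t) = 10t; all constants are illustrative.

```python
def score(x, t):
    """Analytic score of the noised marginal, assuming N(0, 1) data."""
    var = 1.0 + (10.0 * t) ** 2     # marginal variance at time t
    return -x / var

def solve_pf_ode(x, steps=1000):
    """Euler-integrate dx/dt = -1/2 * d[sigma^2]/dt * score(x, t), t: 1 -> 0."""
    dt = 1.0 / steps
    for i in range(steps, 0, -1):
        t = i * dt
        dsigma2_dt = 200.0 * t      # d/dt of (10 t)^2
        # one backward-in-time Euler step
        x = x + 0.5 * dsigma2_dt * score(x, t) * dt
    return x
```

Starting from a noisy sample at t = 1, the trajectory contracts deterministically toward the data scale, which is the behavior an in-memory analog integrator would reproduce in continuous time.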
arXiv Detail & Related papers (2024-04-08T16:34:35Z) - TC-LIF: A Two-Compartment Spiking Neuron Model for Long-Term Sequential Modelling [54.97005925277638]
The identification of sensory cues associated with potential opportunities and dangers is frequently complicated by unrelated events that separate useful cues by long delays.
It remains a challenging task for state-of-the-art spiking neural networks (SNNs) to establish long-term temporal dependency between distant cues.
We propose a novel biologically inspired Two-Compartment Leaky Integrate-and-Fire spiking neuron model, dubbed TC-LIF.
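A generic two-compartment leaky integrate-and-fire update can be sketched as below. This is an illustration of the dendrite/soma structure such models build on; the coupling coefficients (`g_ds`, `g_sd`) and reset rule are hypothetical and do not reproduce TC-LIF's exact dynamics.

```python
def tc_lif_step(u_d, u_s, spike_in, beta_d=0.9, beta_s=0.9,
                g_ds=0.5, g_sd=0.2, theta=1.0):
    """One time step. u_d: dendritic potential, u_s: somatic potential."""
    u_d = beta_d * u_d + spike_in + g_sd * u_s   # dendrite integrates input
    u_s = beta_s * u_s + g_ds * u_d              # soma integrates the dendrite
    spike_out = 1 if u_s >= theta else 0
    if spike_out:
        u_s = 0.0                                # hard reset at the soma
    return u_d, u_s, spike_out
```

Because the dendritic compartment leaks slowly and keeps feeding the soma, input arriving many steps earlier can still influence when the soma fires, which is the mechanism that helps bridge long delays between cues.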
arXiv Detail & Related papers (2023-08-25T08:54:41Z) - Long Short-term Memory with Two-Compartment Spiking Neuron [64.02161577259426]
We propose a novel biologically inspired Long Short-Term Memory Leaky Integrate-and-Fire spiking neuron model, dubbed LSTM-LIF.
Our experimental results, on a diverse range of temporal classification tasks, demonstrate superior temporal classification capability, rapid training convergence, strong network generalizability, and high energy efficiency of the proposed LSTM-LIF model.
This work, therefore, opens up a myriad of opportunities for resolving challenging temporal processing tasks on emerging neuromorphic computing machines.
arXiv Detail & Related papers (2023-07-14T08:51:03Z) - The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z) - Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z) - Dynamic Reliability Management in Neuromorphic Computing [8.616676521313815]
Neuromorphic computing systems use non-volatile memory (NVM) to implement high-density and low-energy synaptic storage.
Currents needed to operate NVMs cause aging of CMOS-based transistors in each neuron and synapse circuit in the hardware.
We propose a new technique to mitigate the aging-related reliability problems in neuromorphic systems, by designing an intelligent run-time manager.
arXiv Detail & Related papers (2021-05-05T13:17:17Z) - Neuromorphic Algorithm-hardware Codesign for Temporal Pattern Learning [11.781094547718595]
We derive an efficient training algorithm for Leaky Integrate-and-Fire neurons, which is capable of training an SNN to learn complex spatio-temporal patterns.
We have developed a CMOS circuit implementation for a memristor-based network of neuron and synapses which retains critical neural dynamics with reduced complexity.
arXiv Detail & Related papers (2021-04-21T18:23:31Z) - Controllable reset behavior in domain wall-magnetic tunnel junction artificial neurons for task-adaptable computation [1.4505273244528207]
Domain wall-magnetic tunnel junction (DW-MTJ) devices have been shown to be able to intrinsically capture biological neuron behavior.
We show that edgy-relaxed behavior can be implemented in DW-MTJ artificial neurons via three alternative mechanisms.
arXiv Detail & Related papers (2021-01-08T16:50:29Z) - Reliability-Performance Trade-offs in Neuromorphic Computing [0.30458514384586394]
Neuromorphic architectures built with Non-Volatile Memory (NVM) can significantly improve the energy efficiency of machine learning tasks designed with Spiking Neural Networks (SNNs).
We observe that the parasitic voltage drops create a significant asymmetry in programming speed and reliability of NVM cells in a crossbar.
This asymmetry in neuromorphic architectures creates reliability-performance trade-offs, which can be exploited efficiently using SNN mapping techniques.
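The asymmetry from parasitic voltage drops can be seen in a first-order model of one crossbar wordline: cells far from the driver see a lower effective programming voltage, so they program more slowly. The wire resistance and cell current below are hypothetical values for illustration.

```python
def effective_voltages(v_prog=3.0, n_cols=64, r_wire=2.0, i_cell=50e-6):
    """Effective voltage at each cell of one wordline (driver at column 0).

    Assumes every cell draws i_cell, so the wire segment before column k
    carries the current of all cells at or beyond column k."""
    v = []
    drop = 0.0
    for k in range(n_cols):
        remaining = n_cols - k              # cells still fed through this segment
        drop += remaining * i_cell * r_wire  # IR drop accumulated so far
        v.append(v_prog - drop)
    return v
```

The monotonically decreasing voltage profile is exactly the asymmetry a mapping technique can exploit, e.g. by placing frequently updated synapses near the driver.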
arXiv Detail & Related papers (2020-09-26T19:38:18Z) - Improving Dependability of Neuromorphic Computing With Non-Volatile Memory [5.306819482496464]
This paper proposes RENEU, a reliability-oriented approach to map machine learning applications to neuromorphic hardware.
Fundamental to RENEU is a novel formulation of the aging of CMOS-based circuits in neuromorphic hardware, considering different failure mechanisms.
Our results demonstrate an average 38% reduction in circuit aging, leading to an average 18% improvement in the lifetime of the hardware compared to current practices.
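One way a reliability-oriented mapping can even out circuit aging is by balancing per-cluster activation load across hardware tiles, so no tile is stressed much faster than the rest. The sketch below uses a classic longest-processing-time greedy heuristic as a stand-in; it is an illustration of the idea, not RENEU's actual formulation.

```python
def balance_map(cluster_loads, n_tiles):
    """Assign clusters (by activation load) to tiles, limiting peak load.

    Returns (assignment dict: cluster index -> tile, per-tile load list)."""
    tiles = [0.0] * n_tiles
    assignment = {}
    # Place the largest clusters first (LPT heuristic), each onto the
    # currently least-loaded tile.
    for idx in sorted(range(len(cluster_loads)),
                      key=lambda i: -cluster_loads[i]):
        tile = min(range(n_tiles), key=lambda t: tiles[t])
        assignment[idx] = tile
        tiles[tile] += cluster_loads[idx]
    return assignment, tiles
```

A lower peak tile load means a lower peak stress, which, under aging models like NBTI, translates into a longer lifetime for the worst-aged circuit.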
arXiv Detail & Related papers (2020-06-10T14:50:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.