To Spike or Not To Spike: A Digital Hardware Perspective on Deep
Learning Acceleration
- URL: http://arxiv.org/abs/2306.15749v5
- Date: Sun, 28 Jan 2024 11:23:43 GMT
- Title: To Spike or Not To Spike: A Digital Hardware Perspective on Deep
Learning Acceleration
- Authors: Fabrizio Ottati, Chang Gao, Qinyu Chen, Giovanni Brignone, Mario R.
Casu, Jason K. Eshraghian, Luciano Lavagno
- Abstract summary: As deep learning models scale, they become increasingly competitive in domains spanning computer vision to natural language processing.
The power efficiency of the biological brain outperforms that of any large-scale deep learning (DL) model.
Neuromorphic computing tries to mimic brain operations to improve the efficiency of DL models.
- Score: 4.712922151067433
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: As deep learning models scale, they become increasingly competitive in
domains spanning from computer vision to natural language processing; however,
this happens at the expense of efficiency, since they require increasingly more
memory and computing power. The power efficiency of the biological brain
outperforms any large-scale deep learning (DL) model; thus, neuromorphic
computing tries to mimic brain operations, such as spike-based information
processing, to improve the efficiency of DL models. Despite the benefits of the
brain, such as efficient information transmission, dense neuronal
interconnects, and the co-location of computation and memory, the available
biological substrate has severely constrained the evolution of biological
brains. Electronic hardware does not have the same constraints; therefore,
while modeling spiking neural networks (SNNs) might uncover one piece of the
puzzle, the design of efficient hardware backends for SNNs needs further
investigation, potentially taking inspiration from the work already done on
the artificial neural network (ANN) side. As such, when is it wise to look
at the brain while designing new hardware, and when should it be ignored? To
answer this question, we quantitatively compare the digital hardware
acceleration techniques and platforms of ANNs and SNNs. As a result, we
provide the following insights: (i) ANNs currently process static data more
efficiently, (ii) applications targeting data produced by neuromorphic sensors,
such as event-based cameras and silicon cochleas, need more investigation, since
the behavior of these sensors might naturally fit the SNN paradigm, and (iii)
hybrid approaches combining SNNs and ANNs might lead to the best solutions and
should be investigated further at the hardware level, accounting for both
efficiency and loss optimization.
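The spike-based processing contrast at the heart of this comparison can be sketched in a few lines: a leaky integrate-and-fire (LIF) neuron, the standard SNN model, emits a sparse binary spike train and keeps internal state, while an ANN neuron produces a dense multi-bit activation with no state. The sketch below is illustrative only; the parameter names and values are assumptions, not taken from the paper.

```python
def lif_neuron(inputs, beta=0.9, threshold=1.0):
    """Simulate a leaky integrate-and-fire (LIF) neuron over time.

    The membrane potential leaks by factor `beta`, integrates the input
    current, and emits a binary spike (followed by a soft reset) whenever
    it crosses `threshold`. The output is a sparse binary spike train.
    """
    v = 0.0                          # membrane potential (state)
    spikes = []
    for x in inputs:                 # serial, step-by-step dynamics
        v = beta * v + x             # leak, then integrate
        s = 1 if v >= threshold else 0
        spikes.append(s)
        if s:                        # soft reset on spike
            v -= threshold
    return spikes

def relu_neuron(inputs):
    """ANN counterpart: a stateless, dense activation per input."""
    return [max(0.0, x) for x in inputs]

x = [0.4, 0.6, 0.1, 0.9, 0.2]
print(lif_neuron(x))    # sparse binary events, e.g. [0, 0, 0, 1, 0]
print(relu_neuron(x))   # dense real-valued activations
```

The hardware implications follow directly: the LIF output is mostly zeros, so a digital backend only pays for synaptic operations when a spike occurs, but it must store and update the membrane state `v` every timestep, which is exactly the compute/memory trade-off the paper quantifies against dense ANN accelerators.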
Related papers
- Channel-wise Parallelizable Spiking Neuron with Multiplication-free Dynamics and Large Temporal Receptive Fields [32.349167886062105]
Spiking Neural Networks (SNNs) are distinguished from Artificial Neural Networks (ANNs) for their sophisticated neuronal dynamics and sparse binary activations (spikes) inspired by the biological neural system.
Traditional neuron models use iterative step-by-step dynamics, resulting in serial computation and slow training speed of SNNs.
Recent parallelizable spiking neuron models have been proposed to fully utilize the massive parallel computing ability of graphics processing units to accelerate the training of SNNs.
arXiv Detail & Related papers (2025-01-24T13:44:08Z) - Neuromorphic Wireless Split Computing with Multi-Level Spikes [69.73249913506042]
Neuromorphic computing uses spiking neural networks (SNNs) to perform inference tasks.
Embedding a small payload within each spike exchanged between spiking neurons can enhance inference accuracy without increasing energy consumption.
Split computing - where an SNN is partitioned across two devices - is a promising solution.
This paper presents the first comprehensive study of a neuromorphic wireless split computing architecture that employs multi-level SNNs.
arXiv Detail & Related papers (2024-11-07T14:08:35Z) - Research Advances and New Paradigms for Biology-inspired Spiking Neural Networks [8.315801422499861]
Spiking neural networks (SNNs) are gaining popularity in the computational simulation and artificial intelligence fields.
This paper explores the historical development of SNNs and concludes that these two fields are intersecting and merging rapidly.
arXiv Detail & Related papers (2024-08-26T03:37:48Z) - Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically-informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models and task-driven SNNs, balancing bioinspiration and complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z) - SpikingJelly: An open-source machine learning infrastructure platform
for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z) - Computational and Storage Efficient Quadratic Neurons for Deep Neural
Networks [10.379191500493503]
Experimental results have demonstrated that the proposed quadratic neuron structure exhibits superior computational and storage efficiency across various tasks.
This work introduces an efficient quadratic neuron architecture distinguished by its enhanced utilization of second-order computational information.
arXiv Detail & Related papers (2023-06-10T11:25:31Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking
Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - Exploiting Noise as a Resource for Computation and Learning in Spiking
Neural Networks [32.0086664373154]
This study introduces the noisy spiking neural network (NSNN) and the noise-driven learning rule (NDL).
NSNN provides a theoretical framework that yields scalable, flexible, and reliable computation.
arXiv Detail & Related papers (2023-05-25T13:21:26Z) - Spiking Hyperdimensional Network: Neuromorphic Models Integrated with
Memory-Inspired Framework [8.910420030964172]
We propose SpikeHD, the first framework that fundamentally combines Spiking neural network and hyperdimensional computing.
SpikeHD exploits spiking neural networks to extract low-level features by preserving the spatial and temporal correlation of raw event-based spike data.
Our evaluation on a set of benchmark classification problems shows that SpikeHD provides the following benefits compared to SNN architectures.
arXiv Detail & Related papers (2021-10-01T05:01:21Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in
Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.