Bridging Quantized Artificial Neural Networks and Neuromorphic Hardware
- URL: http://arxiv.org/abs/2505.12221v2
- Date: Sun, 22 Jun 2025 14:37:05 GMT
- Title: Bridging Quantized Artificial Neural Networks and Neuromorphic Hardware
- Authors: Zhenhui Chen, Haoran Xu, Yangfan Hu, Xiaofei Jin, Xinyu Li, Ziyang Kang, Gang Pan, De Ma
- Abstract summary: Neuromorphic hardware aims to leverage distributed computing and event-driven circuit design to achieve an energy-efficient AI system. In neuromorphic hardware, neurons use single-bit, event-driven data (called spikes) for inter-communication. We propose a Spiking-Driven ANN framework that directly implements quantized ANNs on hardware.
- Score: 21.100919594341317
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neuromorphic hardware aims to leverage distributed computing and event-driven circuit design to achieve an energy-efficient AI system. The name "neuromorphic" is derived from its spiking and local computing nature, which mimics the fundamental activity of an animal's nervous system. In neuromorphic hardware, neurons, i.e., computing cores, use single-bit, event-driven data (called spikes) for inter-communication, which differs substantially from conventional hardware. To leverage the advantages of neuromorphic hardware and implement a computing model, the conventional approach is to build spiking neural networks (SNNs). SNNs replace the nonlinearity of artificial neural networks (ANNs) in the realm of deep learning with spiking neurons, where the spiking neuron mimics the basic behavior of bio-neurons. However, there is still a performance gap between SNNs and their ANN counterparts. In this paper, we explore a new way to map computing models onto neuromorphic hardware. We propose a Spiking-Driven ANN (SDANN) framework that directly implements quantized ANNs on hardware, eliminating the need to tune trainable parameters and incurring no performance degradation. With the power of quantized ANNs, our SDANN ensures a lower bound on implementation performance on neuromorphic hardware. To address the limitation of bit-width support on hardware, we propose bias calibration and scaled integration methods. Experiments on various tasks demonstrate that our SDANN achieves exactly the same accuracy as the quantized ANN. Beyond toy examples and software implementation, we successfully deployed and validated our spiking models on real neuromorphic hardware, demonstrating the feasibility of the SDANN framework.
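The core idea of mapping a quantized ANN onto spike-based hardware can be made concrete with a toy sketch. This is an illustrative assumption, not the paper's actual SDANN algorithm: a B-bit integer activation is transmitted bit-serially as single-bit spikes over B timesteps, and the receiving neuron recovers the exact integer dot product by scaling each timestep's contribution by 2**t.

```python
def quantized_dot(weights, activations):
    """Reference: ordinary integer dot product of a quantized layer."""
    return sum(w * a for w, a in zip(weights, activations))


def spike_driven_dot(weights, activations, bits=8):
    """Hypothetical bit-serial variant using only single-bit events.

    At timestep t, each input emits the t-th bit of its activation as a
    spike; the neuron integrates sum(w * spike) weighted by 2**t.
    Because the bit decomposition is exact, the result matches the
    integer dot product exactly -- no rate-coding approximation.
    """
    acc = 0
    for t in range(bits):
        spikes = [(a >> t) & 1 for a in activations]  # 1-bit events
        acc += (1 << t) * sum(w * s for w, s in zip(weights, spikes))
    return acc


w = [3, -2, 5, 1]
x = [17, 4, 250, 9]  # unsigned 8-bit quantized activations
assert spike_driven_dot(w, x) == quantized_dot(w, x)
```

The exactness of this decomposition is what a lossless mapping needs: unlike rate-coded conversion, there is no approximation error, which is consistent with the abstract's claim of "exactly the same accuracy as the quantized ANN".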
Related papers
- Towards Efficient Deployment of Hybrid SNNs on Neuromorphic and Edge AI Hardware [0.493599216374976]
This paper explores the synergistic potential of neuromorphic and edge computing to create a versatile machine learning (ML) system tailored for processing data captured by dynamic vision sensors.
We construct and train hybrid models, blending spiking neural networks (SNNs) and artificial neural networks (ANNs) using PyTorch and Lava frameworks.
arXiv Detail & Related papers (2024-07-11T17:40:39Z) - Detection of Fast-Moving Objects with Neuromorphic Hardware [12.323012135924374]
Neuromorphic Computing (NC), and SNNs in particular, are often viewed as the next generation of Neural Networks (NNs).
arXiv Detail & Related papers (2024-03-15T20:53:10Z) - Efficient Event-Based Object Detection: A Hybrid Neural Network with Spatial and Temporal Attention [2.5075774828443467]
Spiking Neural Networks (SNNs) on neuromorphic hardware are often considered for energy-efficient and low-latency event-based data processing. Here, we introduce Attention-based Hybrid SNN-ANN backbones for event-based object detection.
arXiv Detail & Related papers (2024-03-15T10:28:31Z) - SpikingJelly: An open-source machine learning infrastructure platform
for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking
Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - Accelerating spiking neural network training [1.6114012813668934]
Spiking neural networks (SNN) are a type of artificial network inspired by the use of action potentials in the brain.
We propose a new technique for directly training single-spike-per-neuron SNNs, which eliminates all sequential computation and relies exclusively on vectorised operations.
Our proposed solution solves certain tasks with over a 95.68% reduction in spike counts relative to a conventionally trained SNN, which could significantly reduce energy requirements when deployed on neuromorphic computers.
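Several of the listed papers build on the same basic spiking unit. As a minimal sketch (not any one paper's specific model), an integrate-and-fire neuron accumulates weighted input in a membrane potential and emits a spike when it crosses a threshold:

```python
def if_neuron(input_currents, threshold=1.0):
    """Minimal integrate-and-fire neuron over discrete timesteps.

    The membrane potential v accumulates the input current; when it
    reaches the threshold the neuron emits a spike (1) and the potential
    is reduced by the threshold ("soft reset"), otherwise it emits 0.
    """
    v, spikes = 0.0, []
    for i in input_currents:
        v += i
        if v >= threshold:
            spikes.append(1)
            v -= threshold  # soft reset preserves residual charge
        else:
            spikes.append(0)
    return spikes


# A constant sub-threshold input spikes only when charge accumulates:
assert if_neuron([0.6, 0.6, 0.6]) == [0, 1, 0]
```

The thresholding step is the non-differentiable operation that the training-focused papers in this list (e.g. surrogate gradients, DSR) work around.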
arXiv Detail & Related papers (2022-05-30T17:48:14Z) - Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for
Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z) - A Spiking Neural Network for Image Segmentation [3.4998703934432682]
We convert the deep Artificial Neural Network (ANN) architecture U-Net to a Spiking Neural Network (SNN) architecture using the Nengo framework.
Both rate-based and spike-based models are trained and optimized for benchmarking performance and power.
The neuromorphic implementation on the Intel Loihi neuromorphic chip is over 2x more energy-efficient than conventional hardware.
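Rate coding is the usual basis for ANN-to-SNN conversions like the U-Net example above. As a hedged sketch (assuming activations normalized to [0, 1]; not the Nengo pipeline itself), a real-valued activation is approximated by a neuron's firing rate over T timesteps:

```python
import random


def rate_encode(activation, timesteps=1000, seed=0):
    """Encode an activation in [0, 1] as a Bernoulli spike train.

    Each timestep emits a spike with probability equal to the
    activation; the empirical firing rate then approximates it.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    return [1 if rng.random() < activation else 0 for _ in range(timesteps)]


spikes = rate_encode(0.3, timesteps=10000)
rate = sum(spikes) / len(spikes)
assert abs(rate - 0.3) < 0.05  # firing rate approximates the activation
```

The approximation error shrinks only as 1/sqrt(T), which is why rate-coded conversions trade latency for accuracy, and why lossless schemes such as the SDANN paper above avoid rate coding altogether.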
arXiv Detail & Related papers (2021-06-16T16:23:18Z) - BSNN: Towards Faster and Better Conversion of Artificial Neural Networks
to Spiking Neural Networks with Bistable Neurons [8.555786938446133]
A spiking neural network (SNN) computes and communicates information through discrete binary events.
Recent work has made substantial progress toward high performance by converting artificial neural networks (ANNs) to SNNs.
We propose a novel bistable spiking neural network (BSNN) that addresses the problem of spikes of inactivated neurons (SIN) caused by the phase lead and phase lag.
arXiv Detail & Related papers (2021-05-27T02:38:02Z) - Compiling Spiking Neural Networks to Mitigate Neuromorphic Hardware
Constraints [0.30458514384586394]
Spiking Neural Networks (SNNs) enable efficient pattern recognition on resource- and power-constrained platforms.
SNNs executed on neuromorphic hardware can further reduce energy consumption of these platforms.
arXiv Detail & Related papers (2020-11-27T19:10:23Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference
to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
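The claim above, that SNNs replace multiply-and-accumulate (MAC) operations with plain additions, follows directly from spikes being binary. A small illustrative sketch (not this paper's TTFS scheme):

```python
def mac_output(weights, inputs):
    """Conventional multiply-accumulate over real-valued inputs."""
    return sum(w * x for w, x in zip(weights, inputs))


def spike_output(weights, spikes):
    """Event-driven equivalent for binary spike inputs.

    With s in {0, 1}, w * s is either 0 or w, so the neuron only ADDS
    the weights of inputs that actually spiked -- no multiplications,
    and sparse inputs mean few additions.
    """
    return sum(w for w, s in zip(weights, spikes) if s)


w = [0.5, -1.0, 2.0, 0.25]
s = [1, 0, 1, 0]  # sparse binary spike vector
assert spike_output(w, s) == mac_output(w, s)
```

Sparsity compounds the saving: an input that never spikes costs nothing, which is the basis for the energy figures quoted throughout this list.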
arXiv Detail & Related papers (2020-06-03T15:55:53Z) - Non-linear Neurons with Human-like Apical Dendrite Activations [81.18416067005538]
We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
We conduct experiments on six benchmark data sets from computer vision, signal processing and natural language processing.
arXiv Detail & Related papers (2020-02-02T21:09:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.