Differential Coding for Training-Free ANN-to-SNN Conversion
- URL: http://arxiv.org/abs/2503.00301v1
- Date: Sat, 01 Mar 2025 02:17:35 GMT
- Title: Differential Coding for Training-Free ANN-to-SNN Conversion
- Authors: Zihan Huang, Wei Fang, Tong Bu, Peng Xue, Zecheng Hao, Wenxuan Liu, Yuanhong Tang, Zhaofei Yu, Tiejun Huang
- Abstract summary: Spiking Neural Networks (SNNs) exhibit significant potential due to their low energy consumption. Converting Artificial Neural Networks (ANNs) to SNNs is an efficient way to achieve high-performance SNNs. This article introduces differential coding for ANN-to-SNN conversion, a novel coding scheme that reduces spike counts and energy consumption.
- Score: 45.70141988713627
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spiking Neural Networks (SNNs) exhibit significant potential due to their low energy consumption. Converting Artificial Neural Networks (ANNs) to SNNs is an efficient way to achieve high-performance SNNs. However, many conversion methods are based on rate coding, which requires numerous spikes and more time steps than directly trained SNNs, leading to increased energy consumption and latency. This article introduces differential coding for ANN-to-SNN conversion, a novel coding scheme that reduces spike counts and energy consumption by transmitting changes in rate information rather than rates directly, and explores its application across various layers. Additionally, the threshold iteration method is proposed to optimize thresholds based on the activation distribution when converting Rectified Linear Units (ReLUs) to spiking neurons. Experimental results on various Convolutional Neural Networks (CNNs) and Transformers demonstrate that the proposed differential coding significantly improves accuracy while reducing energy consumption, particularly when combined with the threshold iteration method, achieving state-of-the-art performance.
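To make the two mechanisms concrete, here is a minimal Python sketch of how a differential-coding unit and the threshold iteration might look. Everything in it (the `DiffCodingNeuron` and `iterate_threshold` names, the signed-spike formulation, the squared-error objective, the choice of T) is our illustrative assumption, not the authors' implementation.

```python
import numpy as np

class DiffCodingNeuron:
    """Sketch of a differential-coding unit: rather than re-transmitting
    its rate at every time step, it emits a signed spike only when its
    running activation estimate drifts by more than one threshold from
    the value already communicated downstream (our assumption)."""

    def __init__(self, threshold: float):
        self.threshold = threshold
        self.membrane = 0.0  # integrated input current
        self.sent = 0.0      # value already signalled via spikes

    def step(self, input_current: float) -> int:
        self.membrane += input_current
        target = max(self.membrane, 0.0)  # ReLU-like clamp
        diff = target - self.sent
        if diff >= self.threshold:
            self.sent += self.threshold
            return +1                     # positive spike: value went up
        if diff <= -self.threshold:
            self.sent -= self.threshold
            return -1                     # negative spike: value went down
        return 0                          # nothing changed: no spike

def iterate_threshold(acts: np.ndarray, T: int = 8, iters: int = 10) -> float:
    """Toy stand-in for threshold iteration: alternately quantize the ReLU
    activations into T spike levels and solve in closed form for the
    threshold that minimizes the squared quantization error."""
    theta = float(acts.max())
    for _ in range(iters):
        n = np.clip(np.round(acts * T / theta), 0, T)  # integer spike counts
        denom = float((n ** 2).sum())
        if denom == 0.0:
            break
        theta = T * float((acts * n).sum()) / denom    # least-squares update
    return theta

if __name__ == "__main__":
    # Ramp-up then steady input: the unit spikes while its estimate
    # changes, falls silent once it settles, and a drop in the input
    # eventually produces a negative spike.
    neuron = DiffCodingNeuron(threshold=0.5)
    print([neuron.step(x) for x in (0.9, 0.9, 0.0, 0.0, -0.6, -0.6)])
    acts = np.maximum(np.random.default_rng(0).normal(size=10_000), 0.0)
    print(f"fitted threshold: {iterate_threshold(acts):.3f}")
```

The property the sketch illustrates is the abstract's central claim: once the transmitted information stops changing, `diff` stays below threshold and the unit emits no spikes, whereas a rate-coded unit would keep spiking to re-transmit the same rate.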
Related papers
- Adaptive Calibration: A Unified Conversion Framework of Spiking Neural Network [1.5215973379400674]
Spiking Neural Networks (SNNs) are seen as an energy-efficient alternative to traditional Artificial Neural Networks (ANNs).
We present a unified training-free conversion framework that significantly enhances both the performance and efficiency of converted SNNs.
arXiv Detail & Related papers (2024-12-18T09:38:54Z)
- Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval.
A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed.
The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z)
- Training-free Conversion of Pretrained ANNs to SNNs for Low-Power and High-Performance Applications [23.502136316777058]
Spiking Neural Networks (SNNs) have emerged as a promising substitute for Artificial Neural Networks (ANNs).
Existing supervised learning algorithms for SNNs require significantly more memory and time than their ANN counterparts.
Our approach directly converts pre-trained ANN models into high-performance SNNs without additional training.
arXiv Detail & Related papers (2024-09-05T09:14:44Z)
- One-Spike SNN: Single-Spike Phase Coding with Base Manipulation for ANN-to-SNN Conversion Loss Minimization [0.41436032949434404]
As spiking neural networks (SNNs) are event-driven, their energy efficiency is higher than that of conventional artificial neural networks (ANNs).
In this work, we propose single-spike phase coding, an encoding scheme that minimizes the number of spikes needed to transfer data between SNN layers (see the sketch after this entry).
Without any additional retraining or architectural constraints on the ANNs, the proposed conversion method incurs minimal inference accuracy loss (0.58% on average), as verified on three convolutional neural networks (CNNs) with the CIFAR and ImageNet datasets.
arXiv Detail & Related papers (2024-01-30T02:00:28Z)
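As a reading aid, here is a hypothetical sketch of single-spike phase coding as we understand it from the summary above: a normalized activation is mapped to the timing of one spike, with the code base controlling the resolution. The function names, the base-power code, and the clamping are our assumptions; the paper's base manipulation may differ in detail.

```python
import numpy as np

def phase_encode(activation: float, base: float = 2.0, max_phase: int = 8) -> int:
    """Map a normalized activation in (0, 1] to a single spike phase.

    A spike at phase t carries the value base**(-t), so later spikes
    represent smaller activations. Returns -1 (no spike) when the value
    is too small to represent within max_phase steps."""
    activation = min(activation, 1.0)
    if activation <= 0.0:
        return -1
    t = int(np.floor(-np.log(activation) / np.log(base)))
    return t if t < max_phase else -1

def phase_decode(t: int, base: float = 2.0) -> float:
    """Recover the (quantized) activation from the spike phase."""
    return 0.0 if t < 0 else base ** (-t)

# One spike moves one value between layers: 0.3 -> phase 1 -> 0.5 in base 2,
# while a smaller base (e.g. 1.3) quantizes the same value more finely.
print(phase_decode(phase_encode(0.3)))                      # 0.5
print(phase_decode(phase_encode(0.3, base=1.3), base=1.3))  # ~0.35
```

The point of such an encoding is that each activation crosses a layer boundary with at most one spike, which is where the claimed spike-count savings come from; manipulating the base trades phase-window length against quantization error.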
- When Bio-Inspired Computing meets Deep Learning: Low-Latency, Accurate, & Energy-Efficient Spiking Neural Networks from Artificial Neural Networks [22.721987637571306]
Spiking Neural Networks (SNNs) are demonstrating accuracy comparable to convolutional neural networks (CNNs).
ANN-to-SNN conversion has recently gained significant traction in developing deep SNNs with close to state-of-the-art (SOTA) test accuracy on complex image recognition tasks.
We propose a novel ANN-to-SNN conversion framework that incurs an exponentially lower number of time steps than the SOTA conversion approaches require.
arXiv Detail & Related papers (2023-12-12T00:10:45Z)
- Spiking Neural Network Decision Feedback Equalization [70.3497683558609]
We propose an SNN-based equalizer with a feedback structure akin to the decision feedback equalizer (DFE).
We show that our approach clearly outperforms conventional linear equalizers for three different exemplary channels.
The proposed SNN with a decision feedback structure paves the way toward competitive, energy-efficient transceivers.
arXiv Detail & Related papers (2022-11-09T09:19:15Z)
- Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z)
- Optimal Conversion of Conventional Artificial Neural Networks to Spiking Neural Networks [0.0]
Spiking neural networks (SNNs) are biologically inspired artificial neural networks (ANNs).
We propose a novel strategic pipeline that transfers the weights to the target SNN by combining threshold balancing and soft-reset mechanisms (sketched after this entry).
Our method is promising for deployment on embedded platforms with limited energy and memory, which offer better support for SNNs.
arXiv Detail & Related papers (2021-02-28T12:04:22Z)
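For readers unfamiliar with the two mechanisms named above, here is a minimal, generic sketch (our own illustration, not this paper's code) of a soft-reset integrate-and-fire unit and percentile-style threshold balancing as they commonly appear in conversion work.

```python
import numpy as np

class SoftResetIF:
    """Integrate-and-fire unit with soft reset ('reset by subtraction'):
    subtracting the threshold on a spike keeps the residual membrane
    charge, which is what lets rate-based conversion stay nearly
    lossless over enough time steps (a hard reset to zero discards it)."""

    def __init__(self, threshold: float):
        self.threshold = threshold
        self.v = 0.0  # membrane potential

    def step(self, input_current: float) -> int:
        self.v += input_current
        if self.v >= self.threshold:
            self.v -= self.threshold  # soft reset keeps the remainder
            return 1
        return 0

def balance_threshold(layer_acts: np.ndarray, percentile: float = 99.9) -> float:
    """Threshold balancing: set the layer threshold near the top of the
    observed ANN activation distribution so spike rates can span its range."""
    return float(np.percentile(layer_acts, percentile))
```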
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired, network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of time-to-first-spike (TTFS)-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
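The addition-only claim in the entry above is easy to see in code. A toy sketch (shapes and names are ours): because the inputs to a spiking layer are binary events, a time step reduces to summing the weight columns of the neurons that fired, with no multiplications.

```python
import numpy as np

def spiking_layer_step(weights: np.ndarray, spikes: np.ndarray) -> np.ndarray:
    """One time step of a fully connected spiking layer.

    weights: (out_dim, in_dim) synaptic weights
    spikes:  (in_dim,) binary events from the previous layer
    Each output accumulates only the weights of neurons that spiked,
    so the whole step is additions -- no multiply-and-accumulate."""
    active = spikes.astype(bool)
    return weights[:, active].sum(axis=1)

# Contrast with the ANN step y = weights @ x, which multiplies every
# weight by a real-valued activation before accumulating.
w = np.arange(6.0).reshape(2, 3)                   # 2 outputs, 3 inputs
print(spiking_layer_step(w, np.array([1, 0, 1])))  # [0+2, 3+5] = [2. 8.]
```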