Bridge the Gap between SNN and ANN for Image Restoration
- URL: http://arxiv.org/abs/2504.01755v1
- Date: Wed, 02 Apr 2025 14:12:06 GMT
- Title: Bridge the Gap between SNN and ANN for Image Restoration
- Authors: Xin Su, Chen Wu, Zhuoran Zheng
- Abstract summary: Currently, neural networks based on the SNN (Spiking Neural Network) framework are beginning to make their mark in the field of image restoration. We propose a novel distillation technique, called asymmetric framework (ANN-SNN) distillation, in which the teacher is an ANN and the student is an SNN. Specifically, we leverage the intermediate features (feature maps) learned by the ANN as hints to guide the training process of the SNN.
- Score: 7.487270862599671
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Models of dense prediction based on traditional Artificial Neural Networks (ANNs) require substantial energy, especially for image restoration tasks. Currently, neural networks based on the SNN (Spiking Neural Network) framework are beginning to make their mark in the field of image restoration, especially as they typically use less than 10% of the energy of ANNs with the same architecture. However, training an SNN is much more expensive than training an ANN, due to the use of the heuristic gradient descent strategy. In other words, the process by which the SNN's membrane potential signal changes from sparse to dense is very slow, which hurts the convergence of the whole model. To tackle this problem, we propose a novel distillation technique, called asymmetric framework (ANN-SNN) distillation, in which the teacher is an ANN and the student is an SNN. Specifically, we leverage the intermediate features (feature maps) learned by the ANN as hints to guide the training process of the SNN. This approach not only accelerates the convergence of the SNN but also improves its final performance, effectively bridging the gap between the efficiency of SNNs and the superior learning capabilities of ANNs. Extensive experimental results show that our SNN-based image restoration model, with only 1/300 the parameters and 1/50 the energy consumption of the teacher network, matches the teacher on some denoising tasks.
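The paper does not include code, but the hint-style distillation it describes is easy to sketch. Below is a minimal, hypothetical PyTorch version of the feature-hint loss: the student SNN's spike trains are averaged over time into firing rates, projected through a 1x1 alignment convolution (an assumption; the actual alignment module is not specified in the abstract), and matched against the frozen ANN teacher's feature maps with an MSE loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def hint_loss(teacher_feat: torch.Tensor,
              student_spikes: torch.Tensor,
              align: nn.Module) -> torch.Tensor:
    """MSE between the ANN teacher's feature map and the SNN student's
    time-averaged spike features, after a 1x1 projection to match widths.

    teacher_feat:   (B, C_t, H, W) float features from the frozen ANN.
    student_spikes: (T, B, C_s, H, W) spike trains from the SNN.
    align:          1x1 conv mapping C_s -> C_t, trained with the student.
    """
    rate = student_spikes.float().mean(dim=0)   # firing rate, (B, C_s, H, W)
    return F.mse_loss(align(rate), teacher_feat.detach())

# Toy usage with made-up shapes.
T, B = 8, 4
teacher_feat = torch.randn(B, 64, 32, 32)                  # from the ANN teacher
student_spikes = torch.randint(0, 2, (T, B, 32, 32, 32)).float()
align = nn.Conv2d(32, 64, kernel_size=1)
print(hint_loss(teacher_feat, student_spikes, align).item())
```

In training, a term like this would be added to the ordinary restoration loss (e.g., L1 against the clean image) with a weighting coefficient, so the hints shape the student's intermediate representations while the task loss drives the final output.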
Related papers
- Efficient ANN-Guided Distillation: Aligning Rate-based Features of Spiking Neural Networks through Hybrid Block-wise Replacement [3.4776100606469096]
Spiking Neural Networks (SNNs) have garnered considerable attention as a potential alternative to Artificial Neural Networks (ANNs). Recent studies have highlighted SNNs' potential on large-scale datasets.
arXiv Detail & Related papers (2025-03-20T09:04:38Z)
- NAS-BNN: Neural Architecture Search for Binary Neural Networks [55.058512316210056]
We propose a novel neural architecture search scheme for binary neural networks, named NAS-BNN.
Our discovered binary model family outperforms previous BNNs across a wide range of operation counts (OPs), from 20M to 200M.
In addition, we validate the transferability of these searched BNNs on the object detection task, and our binary detectors with the searched BNNs achieve a new state-of-the-art result, e.g., 31.6% mAP with 370M OPs, on the MS COCO dataset.
arXiv Detail & Related papers (2024-08-28T02:17:58Z)
- LC-TTFS: Towards Lossless Network Conversion for Spiking Neural Networks with TTFS Coding [55.64533786293656]
We show that our algorithm can achieve a near-perfect mapping between the activation values of an ANN and the spike times of an SNN on a number of challenging AI tasks.
The study paves the way for deploying ultra-low-power TTFS-based SNNs on power-constrained edge computing platforms.
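TTFS (time-to-first-spike) coding, referenced in the title above, maps larger activations to earlier spike times. A minimal sketch of the encoding side, under the simplifying assumption of activations normalized to [0, 1] (illustrative only; the LC-TTFS algorithm itself adds constraints to make the ANN-to-SNN mapping near-lossless):

```python
import torch

def ttfs_encode(activations: torch.Tensor, t_max: int = 32) -> torch.Tensor:
    """Map activations in [0, 1] to integer spike times: larger
    activation -> earlier spike; zero activation never fires."""
    a = activations.clamp(0.0, 1.0)
    times = torch.round((1.0 - a) * (t_max - 1)).long()
    times[a <= 0] = t_max  # outside the window: no spike at all
    return times

x = torch.tensor([0.9, 0.5, 0.1, 0.0])
print(ttfs_encode(x))  # larger inputs fire earlier; 0.0 never fires
```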
arXiv Detail & Related papers (2023-10-23T14:26:16Z)
- LaSNN: Layer-wise ANN-to-SNN Distillation for Effective and Efficient Training in Deep Spiking Neural Networks [7.0691139514420005]
Spiking Neural Networks (SNNs) are biologically realistic and practically promising for low-power applications because of their event-driven mechanism.
A conversion scheme is proposed to obtain competitive accuracy by mapping trained ANNs' parameters to SNNs with the same structures.
A novel SNN training framework is proposed, namely layer-wise ANN-to-SNN knowledge distillation (LaSNN).
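For context on the conversion scheme mentioned above: in rate-coded ANN-to-SNN conversion, trained weights are copied into a structurally identical spiking network, and the firing rate of an integrate-and-fire (IF) neuron approximates a ReLU activation. A toy illustration of that rate/ReLU correspondence (not the LaSNN method itself, which layers distillation on top of conversion):

```python
import torch

def if_rate(x: torch.Tensor, threshold: float = 1.0, steps: int = 100) -> torch.Tensor:
    """Drive an integrate-and-fire neuron with constant input x and
    return its firing rate, which approximates relu(x) / threshold
    (clipped at 1, since a neuron fires at most once per step)."""
    v = torch.zeros_like(x)
    spikes = torch.zeros_like(x)
    for _ in range(steps):
        v = v + x
        fired = (v >= threshold).float()
        spikes += fired
        v = v - fired * threshold  # soft reset keeps the residual charge
    return spikes / steps

x = torch.linspace(-0.5, 1.0, 4)
print(if_rate(x))  # approximately [0.0, 0.0, 0.5, 1.0]
```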
arXiv Detail & Related papers (2023-04-17T03:49:35Z)
- A Hybrid ANN-SNN Architecture for Low-Power and Low-Latency Visual Perception [27.144985031646932]
Spiking Neural Networks (SNNs) are a class of bio-inspired neural networks that promise to bring low-power and low-latency inference to edge devices.
We show, for the task of event-based 2D and 3D human pose estimation, that our method consumes 88% less power with only a 4% decrease in performance compared to its fully-ANN counterpart.
arXiv Detail & Related papers (2023-03-24T17:38:45Z)
- Hybrid Spiking Neural Network Fine-tuning for Hippocampus Segmentation [3.1247096708403914]
Spiking neural networks (SNNs) have emerged as a low-power alternative to artificial neural networks (ANNs).
In this work, we propose a hybrid SNN training scheme and apply it to segment human hippocampi from magnetic resonance images.
arXiv Detail & Related papers (2023-02-14T20:18:57Z)
- Spikeformer: A Novel Architecture for Training High-Performance Low-Latency Spiking Neural Network [6.8125324121155275]
We propose a novel Transformer-based SNN, termed "Spikeformer", which outperforms its ANN counterpart on both static and neuromorphic datasets.
Remarkably, our Spikeformer outperforms other SNNs on ImageNet by a large margin (i.e., more than 5%) and even outperforms its ANN counterpart by 3.1% and 2.2% on DVS-Gesture and ImageNet, respectively.
arXiv Detail & Related papers (2022-11-19T12:49:22Z)
- SNN2ANN: A Fast and Memory-Efficient Training Framework for Spiking Neural Networks [117.56823277328803]
Spiking neural networks are efficient computation models for low-power environments.
We propose an SNN-to-ANN (SNN2ANN) framework to train the SNN in a fast and memory-efficient way.
Experiment results show that our SNN2ANN-based models perform well on the benchmark datasets.
arXiv Detail & Related papers (2022-06-19T16:52:56Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
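The non-differentiability mentioned above comes from the binary spike function: its derivative is zero almost everywhere, so plain backpropagation stalls. A common workaround, shown here as a generic surrogate-gradient sketch (not the paper's DSR method, which instead differentiates through a spike representation), keeps the hard threshold in the forward pass and substitutes a smooth window in the backward pass:

```python
import torch

class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient."""

    @staticmethod
    def forward(ctx, v: torch.Tensor, threshold: float = 1.0):
        ctx.save_for_backward(v)
        ctx.threshold = threshold
        return (v >= threshold).float()  # binary spikes

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Let gradients through only near the threshold (width-1 window).
        surrogate = ((v - ctx.threshold).abs() < 0.5).float()
        return grad_out * surrogate, None

v = torch.randn(5, requires_grad=True)
SpikeFn.apply(v).sum().backward()
print(v.grad)  # nonzero only where v is close to the threshold
```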
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Kernel Based Progressive Distillation for Adder Neural Networks [71.731127378807]
Adder Neural Networks (ANNs), which contain only additions, offer a new way of developing deep neural networks with low energy consumption.
There is an accuracy drop when replacing all convolution filters by adder filters.
We present a novel method for further improving the performance of ANNs without increasing the trainable parameters.
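Note that ANN here abbreviates Adder Neural Network, not Artificial Neural Network. In an adder network, a filter's response is the negative L1 distance between the filter and each input patch, so the forward pass uses only additions and subtractions. A rough sketch of such an adder "convolution" (illustrative; the paper's contribution is the kernel-based progressive distillation, not this operator):

```python
import torch
import torch.nn.functional as F

def adder_conv2d(x: torch.Tensor, weight: torch.Tensor) -> torch.Tensor:
    """Adder layer: output = -sum |patch - filter|. The underlying
    operation needs no multiplications, though this dense sketch uses
    broadcasting for clarity."""
    B, C, H, W = x.shape
    O, _, K, _ = weight.shape
    patches = F.unfold(x, K)                 # (B, C*K*K, L) sliding patches
    w = weight.view(O, -1)                   # (O, C*K*K) flattened filters
    dist = (patches.unsqueeze(1) - w[None, :, :, None]).abs().sum(dim=2)
    return (-dist).view(B, O, H - K + 1, W - K + 1)

x = torch.randn(1, 3, 8, 8)
w = torch.randn(4, 3, 3, 3)
print(adder_conv2d(x, w).shape)  # torch.Size([1, 4, 6, 6])
```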
arXiv Detail & Related papers (2020-09-28T03:29:19Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Training Deep Spiking Neural Networks [0.0]
Brain-inspired spiking neural networks (SNNs) with neuromorphic hardware may offer orders of magnitude higher energy efficiency.
We show that it is possible to train SNNs with the ResNet50 architecture on the CIFAR100 and Imagenette object recognition datasets.
The trained SNN falls behind the analogous ANN in accuracy but requires several orders of magnitude fewer inference time steps.
arXiv Detail & Related papers (2020-06-08T09:47:05Z)