Spiking Denoising Diffusion Probabilistic Models
- URL: http://arxiv.org/abs/2306.17046v4
- Date: Tue, 5 Dec 2023 09:23:25 GMT
- Title: Spiking Denoising Diffusion Probabilistic Models
- Authors: Jiahang Cao, Ziqing Wang, Hanzhong Guo, Hao Cheng, Qiang Zhang, Renjing Xu
- Abstract summary: Spiking neural networks (SNNs) have ultra-low energy consumption and high biological plausibility.
We propose Spiking Denoising Diffusion Probabilistic Models (SDDPM), a new class of SNN-based generative models that achieve high sample quality.
Our approach achieves state-of-the-art results on generative tasks and substantially outperforms other SNN-based generative models.
- Score: 11.018937744626387
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks (SNNs) have ultra-low energy consumption and high
biological plausibility due to their binary and bio-driven nature compared with
artificial neural networks (ANNs). While previous research has primarily
focused on enhancing the performance of SNNs in classification tasks, the
generative potential of SNNs remains relatively unexplored. In our paper, we
put forward Spiking Denoising Diffusion Probabilistic Models (SDDPM), a new
class of SNN-based generative models that achieve high sample quality. To fully
exploit the energy efficiency of SNNs, we propose a purely Spiking U-Net
architecture, which achieves comparable performance to its ANN counterpart
using only 4 time steps, resulting in significantly reduced energy consumption.
Extensive experiments show that our approach achieves state-of-the-art
results on generative tasks and substantially outperforms other SNN-based
generative models, achieving up to 12x and 6x improvements on the CIFAR-10
and CelebA datasets, respectively. Moreover, we propose a threshold-guided
strategy that further improves performance by 2.69% in a training-free
manner. SDDPM represents a significant advance in the field of SNN-based
generation, opening new perspectives and avenues of exploration. Our code is
available at https://github.com/AndyCao1125/SDDPM.
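To make the recipe concrete, the sketch below pairs a generic leaky integrate-and-fire (LIF) neuron, unrolled over a few spiking time steps, with the standard DDPM reverse update such a network would drive. This is an illustrative sketch only, assuming a PyTorch-style implementation: `LIFNeuron` is a stand-in rather than the paper's actual Spiking U-Net component, and the `threshold` argument only hints at where the training-free threshold guidance would act.

```python
# Illustrative sketch only (PyTorch). alphas and alphas_bar are 1-D tensors
# holding the noise schedule and its cumulative products.
import torch


class LIFNeuron(torch.nn.Module):
    def __init__(self, tau: float = 2.0, threshold: float = 1.0):
        super().__init__()
        self.tau = tau              # membrane time constant
        self.threshold = threshold  # firing threshold

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (T, ...) input current over T spiking time steps (e.g., T = 4)
        mem = torch.zeros_like(x[0])
        spikes = []
        for t in range(x.shape[0]):
            mem = mem + (x[t] - mem) / self.tau      # leaky integration
            spike = (mem >= self.threshold).float()  # binary spike output
            mem = mem - spike * self.threshold       # soft reset on firing
            spikes.append(spike)
        return torch.stack(spikes)


@torch.no_grad()
def ddpm_reverse_step(x_t, t, eps, alphas, alphas_bar):
    """One standard DDPM denoising step x_t -> x_{t-1}, given predicted noise eps."""
    mean = x_t - (1 - alphas[t]) / torch.sqrt(1 - alphas_bar[t]) * eps
    mean = mean / torch.sqrt(alphas[t])
    if t == 0:
        return mean
    return mean + torch.sqrt(1 - alphas[t]) * torch.randn_like(x_t)
```

In an SDDPM-style sampler, a Spiking U-Net built from such neurons would supply the noise estimate `eps` at each diffusion step; the short spiking horizon (4 time steps) is what keeps the energy cost low.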
Related papers
- Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency on neuromorphic datasets.
We propose the Hybrid Step-wise Distillation (HSD) method, tailored for neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z)
- Spiking Diffusion Models [9.90242879469799]
Spiking Neural Networks (SNNs) have gained attention for their ultra-low energy consumption and high biological plausibility.
Despite their distinguished properties, the application of SNNs in the computationally intensive field of image generation is still under exploration.
We propose the Spiking Diffusion Models (SDMs), an innovative family of SNN-based generative models that excel in producing high-quality samples with significantly reduced energy consumption.
arXiv Detail & Related papers (2024-08-29T11:56:02Z)
- NAS-BNN: Neural Architecture Search for Binary Neural Networks [55.058512316210056]
We propose a novel neural architecture search scheme for binary neural networks, named NAS-BNN.
Our discovered binary model family outperforms previous BNNs across a wide range of operation counts (OPs), from 20M to 200M.
In addition, we validate the transferability of these searched BNNs on the object detection task; our binary detectors achieve a new state-of-the-art result, e.g., 31.6% mAP with 370M OPs, on the MS COCO dataset.
arXiv Detail & Related papers (2024-08-28T02:17:58Z)
- SDiT: Spiking Diffusion Model with Transformer [1.7630597106970465]
Spiking neural networks (SNNs) have low power consumption and biologically interpretable characteristics.
We utilize a transformer to replace the U-Net structure commonly used in mainstream diffusion models.
It can generate higher quality images with relatively lower computational cost and shorter sampling time.
arXiv Detail & Related papers (2024-02-18T13:42:11Z)
- Fully Spiking Denoising Diffusion Implicit Models [61.32076130121347]
Spiking neural networks (SNNs) have garnered considerable attention owing to their ability to run on neuromorphic devices at very high speed.
We propose the Fully Spiking Denoising Diffusion Implicit Model (FSDDIM), a novel approach that constructs a diffusion model entirely within SNNs.
We demonstrate that the proposed method outperforms the state-of-the-art fully spiking generative model.
arXiv Detail & Related papers (2023-12-04T09:07:09Z)
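For context, the deterministic DDIM update (the eta = 0 case) that implicit models build on is compact enough to quote. The formula below is the standard one; everything SNN-specific in FSDDIM sits around it and is not shown here.

```python
# Standard deterministic DDIM step (eta = 0), written in PyTorch.
# alphas_bar is the 1-D tensor of cumulative noise-schedule products.
import torch

def ddim_step(x_t, eps, t, t_prev, alphas_bar):
    """Jump x_t -> x_{t_prev} given the predicted noise eps."""
    # Estimate the clean image implied by the current noise prediction.
    x0_pred = (x_t - torch.sqrt(1 - alphas_bar[t]) * eps) / torch.sqrt(alphas_bar[t])
    # Re-noise that estimate to the earlier target step, deterministically.
    return torch.sqrt(alphas_bar[t_prev]) * x0_pred + torch.sqrt(1 - alphas_bar[t_prev]) * eps
```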
- Enabling energy-efficient object detection with surrogate gradient descent in spiking neural networks [0.40054215937601956]
Spiking Neural Networks (SNNs) are a biologically plausible neural network model with significant advantages in event-driven processing and spatio-temporal information processing.
In this study, we introduce the Current Mean Decoding (CMD) method, which solves the regression problem to facilitate the training of deep SNNs for object detection tasks.
Based on the surrogate gradient and CMD, we propose the SNN-YOLOv3 model for object detection.
arXiv Detail & Related papers (2023-09-07T15:48:00Z)
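Surrogate gradient descent keeps the hard, non-differentiable threshold in the forward pass but substitutes a smooth derivative in the backward pass. A minimal PyTorch sketch; the sigmoid-shaped surrogate is one common choice, not necessarily the one this paper uses.

```python
# Spike function with a surrogate gradient: forward is a hard threshold,
# backward pretends the spike was a steep sigmoid so gradients can flow.
import torch

class SurrogateSpike(torch.autograd.Function):
    threshold = 1.0  # firing threshold
    beta = 5.0       # surrogate steepness

    @staticmethod
    def forward(ctx, mem):
        ctx.save_for_backward(mem)
        return (mem >= SurrogateSpike.threshold).float()

    @staticmethod
    def backward(ctx, grad_out):
        (mem,) = ctx.saved_tensors
        s = torch.sigmoid(SurrogateSpike.beta * (mem - SurrogateSpike.threshold))
        return grad_out * SurrogateSpike.beta * s * (1 - s)  # sigmoid derivative

# Usage: spikes = SurrogateSpike.apply(membrane_potential)
```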
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
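Two of the coding schemes such a heterogeneous design could mix are rate coding, where the spike count over T steps carries the value, and latency coding, where the spike timing does. A rough sketch of both encoders, illustrative rather than the paper's exact formulation.

```python
# Rate coding: each step fires with probability x, so spike count ~ value.
# Latency coding: each input fires exactly once, and earlier means larger.
import torch

def rate_encode(x: torch.Tensor, T: int) -> torch.Tensor:
    # x in [0, 1] -> Bernoulli spike train of shape (T, *x.shape)
    return torch.bernoulli(x.clamp(0, 1).expand(T, *x.shape))

def latency_encode(x: torch.Tensor, T: int) -> torch.Tensor:
    # x in [0, 1] -> a single spike at step round((1 - x) * (T - 1))
    t_fire = ((1 - x.clamp(0, 1)) * (T - 1)).round().long()
    steps = torch.arange(T).view(T, *([1] * x.dim()))
    return (steps == t_fire.unsqueeze(0)).float()
```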
- Skip Connections in Spiking Neural Networks: An Analysis of Their Effect on Network Training [0.8602553195689513]
Spiking neural networks (SNNs) have gained attention as a promising alternative to traditional artificial neural networks (ANNs).
In this paper, we study the impact of skip connections on SNNs and propose a hyperparameter optimization technique that adapts models from ANNs to SNNs.
We demonstrate that optimizing the position, type, and number of skip connections can significantly improve the accuracy and efficiency of SNNs.
arXiv Detail & Related papers (2023-03-23T07:57:32Z)
- Spikformer: When Spiking Neural Network Meets Transformer [102.91330530210037]
We consider two biologically plausible structures, the Spiking Neural Network (SNN) and the self-attention mechanism.
We propose a novel Spiking Self-Attention (SSA) mechanism as well as a powerful framework, the Spiking Transformer (Spikformer).
arXiv Detail & Related papers (2022-09-29T14:16:49Z)
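The observation underlying SSA is that spike-form Q, K, and V are binary and non-negative, so Q K^T is already a matrix of non-negative counts and the usual softmax can be replaced with a simple scaling. A condensed sketch, with the scale constant and the final re-spiking step as placeholders.

```python
# Condensed spiking self-attention: no softmax, because binary Q, K, V
# make every attention score a non-negative spike-coincidence count.
import torch

def spiking_self_attention(q, k, v, scale: float = 0.125):
    # q, k, v: binary spike tensors of shape (B, N, D) from spiking layers
    attn = q @ k.transpose(-2, -1)   # (B, N, N) non-negative counts
    out = (attn * scale) @ v         # scaled aggregation of spike values
    return (out >= 1.0).float()      # re-spike (placeholder spike function)
```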
- Adaptive-SpikeNet: Event-based Optical Flow Estimation using Spiking Neural Networks with Learnable Neuronal Dynamics [6.309365332210523]
Spiking Neural Networks (SNNs), with their neuro-inspired event-driven processing, can efficiently handle asynchronous data.
We propose an adaptive fully-spiking framework with learnable neuronal dynamics to alleviate the spike vanishing problem.
Our experiments show an average 13% reduction in average endpoint error (AEE) compared to state-of-the-art ANNs.
arXiv Detail & Related papers (2022-09-21T21:17:56Z)
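"Learnable neuronal dynamics" here means treating quantities like the membrane leak and firing threshold as trainable parameters instead of fixed hyperparameters. A minimal sketch of the idea, which may differ from the paper's exact parameterization.

```python
# LIF neuron whose decay and threshold are learned jointly with the weights.
import torch

class AdaptiveLIF(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.leak = torch.nn.Parameter(torch.tensor(0.9))       # learnable decay
        self.threshold = torch.nn.Parameter(torch.tensor(1.0))  # learnable threshold

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (T, ...) input current over T spiking time steps
        mem = torch.zeros_like(x[0])
        spikes = []
        for t in range(x.shape[0]):
            mem = self.leak.sigmoid() * mem + x[t]   # decay kept in (0, 1)
            spike = (mem >= self.threshold).float()  # needs a surrogate to train
            mem = mem * (1 - spike)                  # hard reset after firing
            spikes.append(spike)
        return torch.stack(spikes)
```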
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
The Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
Efficiently training SNNs is a challenge due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.