Spiking Diffusion Models
- URL: http://arxiv.org/abs/2408.16467v1
- Date: Thu, 29 Aug 2024 11:56:02 GMT
- Title: Spiking Diffusion Models
- Authors: Jiahang Cao, Hanzhong Guo, Ziqing Wang, Deming Zhou, Hao Cheng, Qiang Zhang, Renjing Xu
- Abstract summary: Spiking Neural Networks (SNNs) have gained attention for their ultra-low energy consumption and high biological plausibility.
Despite their distinguished properties, the application of SNNs in the computationally intensive field of image generation is still under exploration.
We propose the Spiking Diffusion Models (SDMs), an innovative family of SNN-based generative models that excel in producing high-quality samples with significantly reduced energy consumption.
- Score: 9.90242879469799
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent years have witnessed Spiking Neural Networks (SNNs) gaining attention for their ultra-low energy consumption and high biological plausibility compared with traditional Artificial Neural Networks (ANNs). Despite their distinguished properties, the application of SNNs in the computationally intensive field of image generation is still under exploration. In this paper, we propose the Spiking Diffusion Models (SDMs), an innovative family of SNN-based generative models that excel in producing high-quality samples with significantly reduced energy consumption. In particular, we propose a Temporal-wise Spiking Mechanism (TSM) that allows SNNs to capture more temporal features from a bio-plasticity perspective. In addition, we propose a threshold-guided strategy that can further improve the performances by up to 16.7% without any additional training. We also make the first attempt to use the ANN-SNN approach for SNN-based generation tasks. Extensive experimental results reveal that our approach not only exhibits comparable performance to its ANN counterpart with few spiking time steps, but also outperforms previous SNN-based generative models by a large margin. Moreover, we also demonstrate the high-quality generation ability of SDM on large-scale datasets, e.g., LSUN bedroom. This development marks a pivotal advancement in the capabilities of SNN-based generation, paving the way for future research avenues to realize low-energy and low-latency generative applications. Our code is available at https://github.com/AndyCao1125/SDM.
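The abstract does not spell out the architecture, so the following is only a minimal PyTorch sketch of how an SNN-based noise predictor could plug into a standard DDPM reverse step: a leaky integrate-and-fire (LIF) neuron rate-coded over a few spiking time steps. The class names, the tiny convolutional stand-in for a spiking U-Net, and all hyperparameters are illustrative assumptions, not the authors' released code (see https://github.com/AndyCao1125/SDM for that).

```python
# Illustrative sketch (not the authors' code): an LIF neuron and a rate-coded
# spiking noise predictor used inside one standard DDPM reverse step.
import torch
import torch.nn as nn

class LIFNeuron(nn.Module):
    """Leaky integrate-and-fire activation with a hard firing threshold."""
    def __init__(self, tau: float = 2.0, v_th: float = 1.0):
        super().__init__()
        self.tau, self.v_th = tau, v_th

    def forward(self, x, v):
        v = v + (x - v) / self.tau          # leak and integrate the input current
        spike = (v >= self.v_th).float()    # emit a binary spike at threshold
        v = v * (1.0 - spike)               # hard reset after firing
        return spike, v

class SpikingEpsPredictor(nn.Module):
    """Toy stand-in for a spiking U-Net: averages spikes over T spiking steps."""
    def __init__(self, channels: int = 3, T: int = 4):
        super().__init__()
        self.T = T
        self.conv = nn.Conv2d(channels, channels, 3, padding=1)
        self.lif = LIFNeuron()

    def forward(self, x_t, t):              # t (diffusion step) ignored in this toy
        v = torch.zeros_like(x_t)
        out = 0.0
        for _ in range(self.T):             # repeat the input over spiking time steps
            spike, v = self.lif(self.conv(x_t), v)
            out = out + spike
        return out / self.T                 # rate-coded estimate of the noise

@torch.no_grad()
def ddpm_reverse_step(model, x_t, t, alpha_t, alpha_bar_t, sigma_t):
    """One standard DDPM reverse step using the spiking noise estimate."""
    eps = model(x_t, t)
    mean = (x_t - (1 - alpha_t) / (1 - alpha_bar_t) ** 0.5 * eps) / alpha_t ** 0.5
    return mean + sigma_t * torch.randn_like(x_t)
```

Rate-coding over a few spiking time steps is one common way to read a continuous noise estimate out of binary spikes; the paper's TSM and threshold-guided strategy operate on top of this kind of basic loop.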
Related papers
- Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency on neuromorphic datasets.
We propose the Hybrid Step-wise Distillation (HSD) method, tailored for neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z) - NAS-BNN: Neural Architecture Search for Binary Neural Networks [55.058512316210056]
We propose a novel neural architecture search scheme for binary neural networks, named NAS-BNN.
Our discovered binary model family outperforms previous BNNs for a wide range of operations (OPs) from 20M to 200M.
In addition, we validate the transferability of these searched BNNs on the object detection task, and our binary detectors with the searched BNNs achieve a new state-of-the-art result, e.g., 31.6% mAP with 370M OPs, on the MS COCO dataset.
arXiv Detail & Related papers (2024-08-28T02:17:58Z) - A Hybrid SNN-ANN Network for Event-based Object Detection with Spatial and Temporal Attention [2.5075774828443467]
Event cameras offer high temporal resolution and dynamic range with minimal motion blur, making them promising for object detection tasks.
While Spiking Neural Networks (SNNs) are a natural match for event-based sensory data, Artificial Neural Networks (ANNs) tend to display more stable training dynamics.
We introduce the first Hybrid Attention-based SNN-ANN backbone for object detection using event cameras.
arXiv Detail & Related papers (2024-03-15T10:28:31Z) - SDiT: Spiking Diffusion Model with Transformer [1.7630597106970465]
Spiking neural networks (SNNs) have low power consumption and bio-interpretable characteristics.
We utilize a transformer to replace the commonly used U-Net structure in mainstream diffusion models.
It can generate higher quality images with relatively lower computational cost and shorter sampling time.
arXiv Detail & Related papers (2024-02-18T13:42:11Z) - Fully Spiking Denoising Diffusion Implicit Models [61.32076130121347]
Spiking neural networks (SNNs) have garnered considerable attention owing to their ability to run on neuromorphic devices with super-high speeds.
We propose a novel approach fully spiking denoising diffusion implicit model (FSDDIM) to construct a diffusion model within SNNs.
We demonstrate that the proposed method outperforms the state-of-the-art fully spiking generative model.
arXiv Detail & Related papers (2023-12-04T09:07:09Z) - Spiking Denoising Diffusion Probabilistic Models [11.018937744626387]
Spiking neural networks (SNNs) have ultra-low energy consumption and high biological plausibility.
We propose Spiking Denoising Diffusion Probabilistic Models (SDDPM), a new class of SNN-based generative models that achieve high sample quality.
Our approach achieves state-of-the-art on the generative tasks and substantially outperforms other SNN-based generative models.
arXiv Detail & Related papers (2023-06-29T15:43:06Z) - Hybrid Spiking Neural Network Fine-tuning for Hippocampus Segmentation [3.1247096708403914]
Spiking neural networks (SNNs) have emerged as a low-power alternative to artificial neural networks (ANNs).
In this work, we propose a hybrid SNN training scheme and apply it to segment human hippocampi from magnetic resonance images.
arXiv Detail & Related papers (2023-02-14T20:18:57Z) - Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance with low latency; a generic surrogate-gradient workaround for this non-differentiability is sketched after this list.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z) - Fully Spiking Variational Autoencoder [66.58310094608002]
Spiking neural networks (SNNs) can be run on neuromorphic devices with ultra-high speed and ultra-low energy consumption.
In this study, we build a variational autoencoder (VAE) with SNN to enable image generation.
arXiv Detail & Related papers (2021-09-26T06:10:14Z)
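The DSR entry above flags the non-differentiability of spikes as the central training obstacle. As background only, here is a minimal PyTorch sketch of the generic surrogate-gradient workaround, in which a steep sigmoid derivative stands in for the Dirac delta on the backward pass; this is not the DSR method itself, which instead differentiates on an aggregate spike representation.

```python
# Generic surrogate-gradient sketch (not the DSR method): a Heaviside spike in
# the forward pass with a smooth surrogate derivative in the backward pass.
import torch

class SurrogateSpike(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v_minus_th):
        ctx.save_for_backward(v_minus_th)
        return (v_minus_th >= 0).float()      # fire iff membrane potential >= threshold

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_th,) = ctx.saved_tensors
        k = 4.0                               # steepness of the surrogate sigmoid
        sig = torch.sigmoid(k * v_minus_th)
        return grad_output * k * sig * (1 - sig)

spike_fn = SurrogateSpike.apply  # drop-in replacement for a hard threshold during training
```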
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.