SDiT: Spiking Diffusion Model with Transformer
- URL: http://arxiv.org/abs/2402.11588v2
- Date: Sat, 24 Feb 2024 07:24:09 GMT
- Title: SDiT: Spiking Diffusion Model with Transformer
- Authors: Shu Yang, Hanzhi Ma, Chengting Yu, Aili Wang, Er-Ping Li
- Abstract summary: Spiking neural networks (SNNs) have low power consumption and bio-interpretable characteristics.
We utilize a transformer to replace the commonly used U-Net structure in mainstream diffusion models.
It can generate higher-quality images at relatively lower computational cost and with shorter sampling time.
- Score: 1.7630597106970465
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks (SNNs) have low power consumption and
bio-interpretable characteristics, and are considered to have tremendous
potential for energy-efficient computing. However, the exploration of SNNs on
image generation tasks remains very limited, and a unified and effective
structure for SNN-based generative models has yet to be proposed. In this
paper, we explore a novel diffusion model architecture within spiking neural
networks. We utilize a transformer to replace the U-Net structure commonly used
in mainstream diffusion models, which can generate higher-quality images at
relatively lower computational cost and with shorter sampling time. This work aims
to provide an empirical baseline for research on SNN-based generative models.
Experiments on MNIST, Fashion-MNIST, and CIFAR-10 datasets demonstrate that our
work is highly competitive compared to existing SNN generative models.
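As a rough illustration of the architecture described in the abstract, the sketch below (hypothetical, not the authors' code) builds one transformer block from spiking components, the kind of module that could replace a U-Net as the denoiser backbone of a diffusion model; the LIF dynamics, tensor shapes, and layer sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class LIFNeuron(nn.Module):
    """Leaky integrate-and-fire neuron with hard reset (forward dynamics only)."""
    def __init__(self, tau: float = 2.0, v_th: float = 1.0):
        super().__init__()
        self.tau, self.v_th = tau, v_th

    def forward(self, x):
        # x: (T, B, N, D) input currents over T discrete timesteps
        v = torch.zeros_like(x[0])
        spikes = []
        for t in range(x.shape[0]):
            v = v + (x[t] - v) / self.tau   # leaky integration toward the input
            s = (v >= self.v_th).float()    # fire wherever the threshold is crossed
            v = v * (1.0 - s)               # hard-reset membranes that fired
            spikes.append(s)
        return torch.stack(spikes)          # binary spike trains, same shape as x

class SpikingTransformerBlock(nn.Module):
    """One transformer block whose activations are spike trains."""
    def __init__(self, dim: int = 128, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.fc1 = nn.Linear(dim, 4 * dim)
        self.fc2 = nn.Linear(4 * dim, dim)
        self.lif_attn, self.lif_mid, self.lif_out = LIFNeuron(), LIFNeuron(), LIFNeuron()

    def forward(self, x):
        # x: (T, B, N, D) spike-form token sequence
        T, B, N, D = x.shape
        flat = x.reshape(T * B, N, D)
        h, _ = self.attn(flat, flat, flat)            # self-attention at each timestep
        x = self.lif_attn(x + h.reshape(T, B, N, D))  # spiking residual branch
        h = self.fc2(self.lif_mid(self.fc1(x)))       # MLP with a spiking nonlinearity
        return self.lif_out(x + h)

# 4 timesteps, batch of 2, 16 tokens of width 128
tokens = (torch.rand(4, 2, 16, 128) > 0.8).float()
print(SpikingTransformerBlock()(tokens).shape)        # torch.Size([4, 2, 16, 128])
```

In a full model, a timestep embedding and the noisy image patches would pass through a stack of such blocks to predict the noise, as in a conventional diffusion transformer; those details are omitted here.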
Related papers
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN), we reduce the time and space complexities from cubic and quadratic in the sequence length, respectively, to linear.
Extensive experiments demonstrate that S-MNN matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z) - Spiking Diffusion Models [9.90242879469799]
Spiking Neural Networks (SNNs) have gained attention for their ultra-low energy consumption and high biological plausibility.
Despite their distinguished properties, the application of SNNs in the computationally intensive field of image generation is still under exploration.
We propose the Spiking Diffusion Models (SDMs), an innovative family of SNN-based generative models that excel in producing high-quality samples with significantly reduced energy consumption.
arXiv Detail & Related papers (2024-08-29T11:56:02Z) - Towards Efficient Deployment of Hybrid SNNs on Neuromorphic and Edge AI Hardware [0.493599216374976]
This paper explores the synergistic potential of neuromorphic and edge computing to create a versatile machine learning (ML) system tailored for processing data captured by dynamic vision sensors.
We construct and train hybrid models, blending spiking neural networks (SNNs) and artificial neural networks (ANNs) using PyTorch and Lava frameworks.
arXiv Detail & Related papers (2024-07-11T17:40:39Z) - Learning Long Sequences in Spiking Neural Networks [0.0]
Spiking neural networks (SNNs) take inspiration from the brain to enable energy-efficient computations.
Recent interest in efficient alternatives to Transformers has given rise to state-of-the-art recurrent architectures named state space models (SSMs).
arXiv Detail & Related papers (2023-12-14T13:30:27Z) - Fully Spiking Denoising Diffusion Implicit Models [61.32076130121347]
Spiking neural networks (SNNs) have garnered considerable attention owing to their ability to run on neuromorphic devices with super-high speeds.
We propose a novel approach, the fully spiking denoising diffusion implicit model (FSDDIM), to construct a diffusion model within SNNs.
We demonstrate that the proposed method outperforms the state-of-the-art fully spiking generative model.
arXiv Detail & Related papers (2023-12-04T09:07:09Z) - Spiking Denoising Diffusion Probabilistic Models [11.018937744626387]
Spiking neural networks (SNNs) have ultra-low energy consumption and high biological plausibility.
We propose Spiking Denoising Diffusion Probabilistic Models (SDDPM), a new class of SNN-based generative models that achieve high sample quality.
Our approach achieves state-of-the-art results on generative tasks and substantially outperforms other SNN-based generative models.
arXiv Detail & Related papers (2023-06-29T15:43:06Z) - Spikformer: When Spiking Neural Network Meets Transformer [102.91330530210037]
We consider two biologically plausible structures, the Spiking Neural Network (SNN) and the self-attention mechanism.
We propose a novel Spiking Self Attention (SSA) mechanism as well as a powerful framework named Spiking Transformer (Spikformer).
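Below is a minimal sketch of the spiking self-attention idea named in this entry; the scaling factor, shapes, and re-binarization step are assumptions rather than the paper's implementation. The key property is that spike-form Q, K, and V are non-negative, so the attention scores can be used without a softmax.

```python
import torch

def heaviside_spike(x, v_th: float = 1.0):
    """Binarize a membrane potential into spikes (forward pass only)."""
    return (x >= v_th).float()

def spiking_self_attention(q, k, v, scale: float = 0.125):
    # q, k, v: (B, N, D) binary spike tensors produced by upstream spiking neurons
    attn = q @ k.transpose(-2, -1)   # spike-spike scores: non-negative and sparse
    out = (attn @ v) * scale         # no softmax needed since all scores are >= 0
    return heaviside_spike(out)      # re-binarize so the output stays spike-form

B, N, D = 2, 16, 32
q = (torch.rand(B, N, D) > 0.8).float()   # random spike trains for demonstration
k = (torch.rand(B, N, D) > 0.8).float()
v = (torch.rand(B, N, D) > 0.8).float()
print(spiking_self_attention(q, k, v).shape)  # torch.Size([2, 16, 32])
```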
arXiv Detail & Related papers (2022-09-29T14:16:49Z) - Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which addresses this challenge and achieves high performance.
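For context on the non-differentiability mentioned above, here is a sketch of the widely used surrogate-gradient workaround; note this is a different technique from DSR (which differentiates through a spike representation) and is included only as an assumption-labeled illustration of how training can proceed despite hard thresholds.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Hard threshold forward, smooth surrogate derivative backward."""
    @staticmethod
    def forward(ctx, v, v_th=1.0):
        ctx.save_for_backward(v)
        ctx.v_th = v_th
        return (v >= v_th).float()   # spike emission: derivative is zero almost everywhere

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Replace the ill-defined derivative with a smooth bump around the threshold.
        surrogate = 1.0 / (1.0 + (torch.pi * (v - ctx.v_th)) ** 2)
        return grad_out * surrogate, None   # no gradient for v_th

v = torch.randn(8, requires_grad=True)
spikes = SurrogateSpike.apply(v)
spikes.sum().backward()   # gradients flow despite the hard threshold
print(v.grad)
```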
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z) - Fully Spiking Variational Autoencoder [66.58310094608002]
Spiking neural networks (SNNs) can be run on neuromorphic devices with ultra-high speed and ultra-low energy consumption.
In this study, we build a variational autoencoder (VAE) with SNN to enable image generation.
arXiv Detail & Related papers (2021-09-26T06:10:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.