Fully Spiking Denoising Diffusion Implicit Models
- URL: http://arxiv.org/abs/2312.01742v1
- Date: Mon, 4 Dec 2023 09:07:09 GMT
- Title: Fully Spiking Denoising Diffusion Implicit Models
- Authors: Ryo Watanabe, Yusuke Mukuta and Tatsuya Harada
- Abstract summary: Spiking neural networks (SNNs) have garnered considerable attention owing to their ability to run on neuromorphic devices with super-high speeds.
We propose a novel approach, the fully spiking denoising diffusion implicit model (FSDDIM), to construct a diffusion model within SNNs.
We demonstrate that the proposed method outperforms the state-of-the-art fully spiking generative model.
- Score: 61.32076130121347
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks (SNNs) have garnered considerable attention owing to
their ability to run on neuromorphic devices with super-high speeds and
remarkable energy efficiency. SNNs can substitute for conventional neural
networks in applications that would otherwise be time- and energy-consuming.
However, research on generative models within SNNs remains limited, despite
their advantages. In particular, diffusion models are a powerful class of
generative models whose image generation quality surpasses that of other
generative models such as GANs. However, diffusion models incur high
computational costs and long inference times owing to their iterative
denoising procedure. Therefore, we propose a novel approach, the fully spiking
denoising diffusion implicit model (FSDDIM), to construct a diffusion model
within SNNs and leverage their high speed and low energy consumption via
synaptic current learning (SCL). SCL bridges the gap between diffusion models,
which use a neural network to estimate real-valued parameters of a predefined
probability distribution, and SNNs, which output binary spike trains. SCL
enables us to complete the entire generative process of diffusion models
exclusively with SNNs. We demonstrate that the proposed method outperforms the
state-of-the-art fully spiking generative model.
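To make concrete the gap that SCL addresses, the sketch below shows, in plain PyTorch, how a binary spike train might be low-pass filtered into a real-valued synaptic current and then used as the noise estimate in a deterministic DDIM update. Everything here (the exponential filter, the function names, and the usage) is an illustrative assumption, not the authors' implementation.

```python
# Hypothetical sketch of the gap SCL bridges: an SNN emits binary spike
# trains, but a DDIM denoising step needs a real-valued noise estimate.
# Not the authors' code; the filter and all names are illustrative.
import torch

def spikes_to_current(spikes: torch.Tensor, decay: float = 0.8) -> torch.Tensor:
    """Low-pass filter a binary spike train of shape (T, ...) over its T
    time steps, yielding a real-valued synaptic current."""
    current = torch.zeros_like(spikes[0])
    for s in spikes:                      # iterate over time steps
        current = decay * current + s     # exponential synaptic filter
    return current

def ddim_step(x_t, eps_hat, abar_t, abar_prev):
    """One deterministic DDIM update (sigma = 0), with abar_* denoting the
    cumulative noise schedule alpha-bar:
      x0_hat = (x_t - sqrt(1 - abar_t) * eps_hat) / sqrt(abar_t)
      x_prev = sqrt(abar_prev) * x0_hat + sqrt(1 - abar_prev) * eps_hat
    """
    x0_hat = (x_t - (1 - abar_t) ** 0.5 * eps_hat) / abar_t ** 0.5
    return abar_prev ** 0.5 * x0_hat + (1 - abar_prev) ** 0.5 * eps_hat

# Hypothetical usage, where `snn` is a spiking denoising network that
# returns binary spikes of shape (T, B, C, H, W):
#   spikes  = snn(x_t, t)
#   eps_hat = spikes_to_current(spikes)   # real-valued estimate from spikes
#   x_prev  = ddim_step(x_t, eps_hat, abar_t, abar_prev)
```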
Related papers
- Spiking Diffusion Models [9.90242879469799]
Spiking Neural Networks (SNNs) have gained attention for their ultra-low energy consumption and high biological plausibility.
Despite their distinguished properties, the application of SNNs in the computationally intensive field of image generation is still under exploration.
We propose the Spiking Diffusion Models (SDMs), an innovative family of SNN-based generative models that excel in producing high-quality samples with significantly reduced energy consumption.
arXiv Detail & Related papers (2024-08-29T11:56:02Z)
- Neural Network Parameter Diffusion [50.85251415173792]
Diffusion models have achieved remarkable success in image and video generation.
In this work, we demonstrate that diffusion models can also generate high-performing neural network parameters.
arXiv Detail & Related papers (2024-02-20T16:59:03Z)
- SDiT: Spiking Diffusion Model with Transformer [1.7630597106970465]
Spiking neural networks (SNNs) have low power consumption and bio-interpretable characteristics.
We utilize a transformer to replace the U-Net structure commonly used in mainstream diffusion models.
It can generate higher quality images with relatively lower computational cost and shorter sampling time.
arXiv Detail & Related papers (2024-02-18T13:42:11Z)
- Unleashing the Potential of Spiking Neural Networks for Sequential Modeling with Contextual Embedding [32.25788551849627]
Brain-inspired spiking neural networks (SNNs) have struggled to match their biological counterparts in modeling long-term temporal relationships.
This paper presents a novel Contextual Embedding Leaky Integrate-and-Fire (CE-LIF) spiking neuron model (a generic LIF sketch appears after this list).
arXiv Detail & Related papers (2023-08-29T09:33:10Z)
- Spiking Denoising Diffusion Probabilistic Models [11.018937744626387]
Spiking neural networks (SNNs) have ultra-low energy consumption and high biological plausibility.
We propose Spiking Denoising Diffusion Probabilistic Models (SDDPM), a new class of SNN-based generative models that achieve high sample quality.
Our approach achieves state-of-the-art results on generative tasks and substantially outperforms other SNN-based generative models.
arXiv Detail & Related papers (2023-06-29T15:43:06Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
Training SNNs efficiently is challenging due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- EINNs: Epidemiologically-Informed Neural Networks [75.34199997857341]
We introduce a new class of physics-informed neural networks, EINNs, crafted for epidemic forecasting.
We investigate how to leverage both the theoretical flexibility provided by mechanistic models and the data-driven expressibility afforded by AI models.
arXiv Detail & Related papers (2022-02-21T18:59:03Z)
- Exploiting Spiking Dynamics with Spatial-temporal Feature Normalization in Graph Learning [9.88508686848173]
Biological spiking neurons with intrinsic dynamics underlie the powerful representation and learning capabilities of the brain.
Despite recent tremendous progress in spiking neural networks (SNNs) for handling Euclidean-space tasks, it remains challenging to exploit SNNs in processing non-Euclidean-space data.
Here we present a general spike-based modeling framework that enables the direct training of SNNs for graph learning.
arXiv Detail & Related papers (2021-06-30T11:20:16Z)
- Sparse Flows: Pruning Continuous-depth Models [107.98191032466544]
We show that pruning improves generalization for neural ODEs in generative modeling.
We also show that pruning finds minimal and efficient neural ODE representations with up to 98% fewer parameters than the original network, without loss of accuracy.
arXiv Detail & Related papers (2021-06-24T01:40:17Z)
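Several entries above (CE-LIF, SDDPM, DSR) build on the leaky integrate-and-fire neuron. As a point of reference, a generic discrete-time LIF update is sketched below; this is a textbook formulation, not any of these papers' specific implementation.

```python
# Generic discrete-time leaky integrate-and-fire (LIF) neuron, the
# building block that CE-LIF and related spiking models extend.
# Textbook sketch; parameter values are illustrative.
import torch

def lif_step(v, x, decay=0.9, v_th=1.0):
    """One LIF time step: leak, integrate input current x, fire a binary
    spike where the membrane potential v crosses the threshold, then reset."""
    v = decay * v + x                 # leaky integration of input current
    spike = (v >= v_th).float()       # binary spike output
    v = v * (1.0 - spike)             # hard reset where a spike fired
    return v, spike

# Example: run T time steps over a batch of stand-in input currents.
T, batch = 8, 4
v = torch.zeros(batch)
spikes = []
for t in range(T):
    x = torch.rand(batch)             # stand-in input current
    v, s = lif_step(v, x)
    spikes.append(s)
spike_train = torch.stack(spikes)     # shape (T, batch), binary
```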