Fully Spiking Variational Autoencoder
- URL: http://arxiv.org/abs/2110.00375v2
- Date: Tue, 5 Oct 2021 07:26:57 GMT
- Title: Fully Spiking Variational Autoencoder
- Authors: Hiromichi Kamata, Yusuke Mukuta, Tatsuya Harada
- Abstract summary: Spiking neural networks (SNNs) can be run on neuromorphic devices with ultra-high speed and ultra-low energy consumption.
In this study, we build a variational autoencoder (VAE) with SNN to enable image generation.
- Score: 66.58310094608002
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spiking neural networks (SNNs) can be run on neuromorphic devices with
ultra-high speed and ultra-low energy consumption because of their binary and
event-driven nature. Therefore, SNNs are expected to have various applications,
including as generative models running on edge devices to create
high-quality images. In this study, we build a variational autoencoder (VAE)
with SNNs to enable image generation. VAEs are known for their stability among
generative models, and their generation quality has recently improved. In a vanilla VAE, the latent
space is represented as a normal distribution, and floating-point calculations
are required in sampling. However, this is not possible in SNNs because all
features must be binary time series data. Therefore, we constructed the latent
space with an autoregressive SNN model, and randomly selected samples from its
output to sample the latent variables. This allows the latent variables to
follow a Bernoulli process and enables variational learning. Thus, we built
the Fully Spiking Variational Autoencoder where all modules are constructed
with SNNs. To the best of our knowledge, we are the first to build a VAE only
with SNN layers. We experimented with several datasets, and confirmed that it
can generate images of the same or better quality than conventional
ANNs. The code is available at https://github.com/kamata1729/FullySpikingVAE
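The key idea above is sampling binary latent variables from an autoregressive model instead of a Gaussian. The following is an illustrative PyTorch sketch of that sampling scheme, not the authors' implementation (see the linked repository for that): a small autoregressive module emits k binary candidates per latent dimension and timestep, and one candidate is selected uniformly at random, so each latent variable stays Bernoulli. The sizes, names, and the plain linear layer standing in for a true spiking network are assumptions.
```python
# Minimal sketch of autoregressive Bernoulli latent sampling, FSVAE-style.
import torch
import torch.nn as nn

class AutoregressiveBernoulliSampler(nn.Module):
    def __init__(self, latent_dim: int = 32, k: int = 10):
        super().__init__()
        self.latent_dim = latent_dim
        self.k = k  # binary candidates per latent dim and timestep
        # Maps the previous latent spikes to k candidate logits each.
        self.ar = nn.Linear(latent_dim, latent_dim * k)

    def forward(self, timesteps: int, batch: int = 1) -> torch.Tensor:
        z_prev = torch.zeros(batch, self.latent_dim)
        zs = []
        for _ in range(timesteps):
            logits = self.ar(z_prev).view(batch, self.latent_dim, self.k)
            # Binarize the k candidates (one Bernoulli draw per candidate).
            candidates = torch.bernoulli(torch.sigmoid(logits))
            # Randomly pick one candidate per latent dim, so each z_t
            # remains a binary Bernoulli variable with no float sampling.
            idx = torch.randint(self.k, (batch, self.latent_dim, 1))
            z_t = candidates.gather(-1, idx).squeeze(-1)
            zs.append(z_t)
            z_prev = z_t
        return torch.stack(zs, dim=1)  # (batch, timesteps, latent_dim)

sampler = AutoregressiveBernoulliSampler()
z = sampler(timesteps=16)  # binary latent spike train
```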
Related papers
- NAS-BNN: Neural Architecture Search for Binary Neural Networks [55.058512316210056]
We propose a novel neural architecture search scheme for binary neural networks, named NAS-BNN.
Our discovered binary model family outperforms previous BNNs across a wide range of operation counts (OPs), from 20M to 200M.
In addition, we validate the transferability of the searched BNNs on the object detection task: our binary detectors achieve a new state-of-the-art result, e.g., 31.6% mAP with 370M OPs, on the MS COCO dataset.
arXiv Detail & Related papers (2024-08-28T02:17:58Z)
- ANN vs SNN: A case study for Neural Decoding in Implantable Brain-Machine Interfaces [0.7904805552920349]
In this work, we compare different neural networks (NNs) for motor decoding in terms of accuracy and implementation cost.
We further show that combining traditional signal processing techniques with machine learning ones delivers surprisingly good performance even with simple NNs.
arXiv Detail & Related papers (2023-12-26T05:40:39Z)
- ESVAE: An Efficient Spiking Variational Autoencoder with Reparameterizable Poisson Spiking Sampling [20.36674120648714]
Variational autoencoders (VAEs) are one of the most popular image generation models.
Current VAE methods implicitly construct the latent space with an elaborate autoregressive network.
We propose an efficient spiking variational autoencoder (ESVAE) that constructs an interpretable latent space distribution.
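As a rough illustration of what "reparameterizable Poisson spiking sampling" can look like, here is a hedged sketch: per-timestep Bernoulli draws from a firing rate, with a straight-through trick so gradients reach the rate. ESVAE's exact construction may differ; all names here are illustrative.
```python
# Sketch: Poisson-style spike sampling with a straight-through gradient path.
import torch

def reparam_poisson_spikes(rate: torch.Tensor, timesteps: int) -> torch.Tensor:
    """rate: firing probabilities in [0, 1], shape (batch, latent_dim)."""
    r = rate.unsqueeze(1).expand(-1, timesteps, -1)  # (B, T, D)
    u = torch.rand_like(r)                           # uniform noise
    spikes = (u < r).float()                         # hard binary spikes
    # Straight-through estimator: forward pass uses the binary spikes,
    # backward pass treats them as the (differentiable) rate.
    return spikes + r - r.detach()

logits = torch.randn(4, 32, requires_grad=True)
z = reparam_poisson_spikes(torch.sigmoid(logits), timesteps=16)
z.sum().backward()  # logits.grad is populated via the straight-through path
```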
arXiv Detail & Related papers (2023-10-23T12:01:10Z)
- Spiking-Diffusion: Vector Quantized Discrete Diffusion Model with Spiking Neural Networks [13.586012318909907]
Spiking neural networks (SNNs) have tremendous potential for energy-efficient neuromorphic chips.
We propose a Spiking-Diffusion model, which is based on the vector quantized discrete diffusion model.
Experimental results on MNIST, FMNIST, KMNIST, Letters, and CIFAR-10 demonstrate that Spiking-Diffusion outperforms existing SNN-based generative models.
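For context, a vector-quantized model snaps each encoder output to its nearest codebook entry, yielding discrete tokens that a discrete diffusion prior can operate on. A minimal, hypothetical sketch of that quantization step (not Spiking-Diffusion's code):
```python
# Sketch: the nearest-codebook lookup at the heart of vector quantization.
import torch

def vector_quantize(z_e: torch.Tensor, codebook: torch.Tensor):
    """z_e: (N, D) encoder outputs; codebook: (K, D) embeddings."""
    dists = torch.cdist(z_e, codebook)   # (N, K) pairwise distances
    indices = dists.argmin(dim=1)        # discrete code per vector
    z_q = codebook[indices]              # quantized embeddings
    # Copy gradients from z_q back to the encoder (straight-through).
    z_q = z_e + (z_q - z_e).detach()
    return z_q, indices

codebook = torch.randn(512, 64)
z_e = torch.randn(8, 64, requires_grad=True)
z_q, idx = vector_quantize(z_e, codebook)
```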
arXiv Detail & Related papers (2023-08-20T07:29:03Z)
- Spikeformer: A Novel Architecture for Training High-Performance Low-Latency Spiking Neural Network [6.8125324121155275]
We propose a novel Transformer-based SNN, termed "Spikeformer", which outperforms its ANN counterpart on both static and neuromorphic datasets.
Remarkably, our Spikeformer outperforms other SNNs on ImageNet by a large margin (i.e., more than 5%) and even outperforms its ANN counterpart by 3.1% and 2.2% on DVS-Gesture and ImageNet, respectively.
arXiv Detail & Related papers (2022-11-19T12:49:22Z)
- SNN2ANN: A Fast and Memory-Efficient Training Framework for Spiking Neural Networks [117.56823277328803]
Spiking neural networks are efficient computation models for low-power environments.
We propose an SNN-to-ANN (SNN2ANN) framework to train SNNs in a fast and memory-efficient way.
Experimental results show that our SNN2ANN-based models perform well on the benchmark datasets.
arXiv Detail & Related papers (2022-06-19T16:52:56Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
Training SNNs efficiently is challenging because spike generation is non-differentiable.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
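The non-differentiability referred to here is that the spiking nonlinearity is a step function with zero gradient almost everywhere. The sketch below illustrates the problem and the common surrogate-gradient workaround; note that DSR itself takes a different route, differentiating a rate-based spike representation rather than replacing the backward pass.
```python
# Sketch: Heaviside spiking with a rectangular surrogate gradient.
import torch

class SurrogateSpike(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()  # Heaviside: spike when v crosses threshold

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Pass gradients only near the threshold; elsewhere the true
        # derivative is zero, which would stall learning.
        return grad_out * (v.abs() < 0.5).float()

v = torch.randn(10, requires_grad=True)
s = SurrogateSpike.apply(v)
s.sum().backward()  # v.grad is nonzero near the threshold
```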
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Sub-bit Neural Networks: Learning to Compress and Accelerate Binary Neural Networks [72.81092567651395]
Sub-bit Neural Networks (SNNs) are a new type of binary quantization design tailored to compress and accelerate BNNs.
SNNs are trained with a kernel-aware optimization framework, which exploits binary quantization in the fine-grained convolutional kernel space.
Experiments on visual recognition benchmarks and hardware deployment on FPGA validate the great potential of SNNs.
arXiv Detail & Related papers (2021-10-18T11:30:29Z)
- PrivateSNN: Fully Privacy-Preserving Spiking Neural Networks [6.336941090564427]
PrivateSNN aims to build low-power Spiking Neural Networks (SNNs) from a pre-trained ANN model without leaking sensitive information contained in a dataset.
We tackle two types of leakage problems: data leakage, caused when the networks access real training data during an ANN-SNN conversion process, and class leakage, caused when class-related features can be reconstructed from the network parameters.
In order to address the data leakage issue, we generate synthetic images from the pre-trained ANNs and convert ANNs to SNNs using generated images.
We observe that the encrypted PrivateSNN can be implemented not only without a huge performance drop but also with significant energy savings.
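For background, rate-based ANN-SNN conversion of the kind PrivateSNN builds on typically copies trained weights and replaces ReLU units with integrate-and-fire neurons, whose firing rates over T timesteps approximate the ReLU activations. A simplified sketch under those assumptions (threshold and encoding details are illustrative):
```python
# Sketch: one converted layer with integrate-and-fire (IF) neurons.
import torch

def if_layer(x_rate: torch.Tensor, weight: torch.Tensor, T: int = 100,
             v_th: float = 1.0) -> torch.Tensor:
    """x_rate: input firing rates in [0, 1]; returns output firing rates."""
    v = torch.zeros(weight.shape[0])
    spikes_out = torch.zeros(weight.shape[0])
    for _ in range(T):
        x_spk = (torch.rand_like(x_rate) < x_rate).float()  # rate-coded input
        v = v + weight @ x_spk       # integrate synaptic current
        spk = (v >= v_th).float()    # fire when the threshold is crossed
        v = v - spk * v_th           # soft reset: subtract the threshold
        spikes_out += spk
    return spikes_out / T            # output rate roughly tracks ReLU(Wx)

w = torch.rand(5, 8) * 0.3           # toy nonnegative weights
x = torch.rand(8)
print(if_layer(x, w))
```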
arXiv Detail & Related papers (2021-04-07T22:14:02Z)
- Event-Based Angular Velocity Regression with Spiking Networks [51.145071093099396]
Spiking Neural Networks (SNNs) process information conveyed as temporal spikes rather than numeric values.
We address, for the first time, the problem of temporally regressing numerical values from the events of an event camera.
We show that we can successfully train an SNN to perform angular velocity regression.
arXiv Detail & Related papers (2020-03-05T17:37:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.