PrivateSNN: Fully Privacy-Preserving Spiking Neural Networks
- URL: http://arxiv.org/abs/2104.03414v1
- Date: Wed, 7 Apr 2021 22:14:02 GMT
- Title: PrivateSNN: Fully Privacy-Preserving Spiking Neural Networks
- Authors: Youngeun Kim, Yeshwanth Venkatesha and Priyadarshini Panda
- Abstract summary: PrivateSNN aims to build low-power Spiking Neural Networks (SNNs) from a pre-trained ANN model without leaking sensitive information contained in a dataset.
We tackle two types of leakage problems: data leakage, caused when the networks access real training data during the ANN-SNN conversion process, and class leakage, caused when class-related features can be reconstructed from network parameters.
In order to address the data leakage issue, we generate synthetic images from the pre-trained ANNs and convert ANNs to SNNs using generated images.
We observe that the encrypted PrivateSNN can be implemented without a large performance drop and with a significant energy-efficiency gain.
- Score: 6.336941090564427
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: How can we bring both privacy and energy-efficiency to a neural system on
edge devices? In this paper, we propose PrivateSNN, which aims to build
low-power Spiking Neural Networks (SNNs) from a pre-trained ANN model without
leaking sensitive information contained in a dataset. Here, we tackle two types
of leakage problems: 1) Data leakage, caused when the networks access real
training data during the ANN-SNN conversion process. 2) Class leakage, caused
when class-related features can be reconstructed from network parameters. To
address the data leakage issue, we generate synthetic images from the
pre-trained ANN and convert the ANN to an SNN using the generated images.
However, the converted SNNs remain vulnerable to class leakage, since their
weight parameters have the same (or scaled) values as the ANN parameters.
Therefore, we encrypt the SNN weights by training the SNNs with a temporal
spike-based learning rule. Updating the weight parameters with temporal data
makes the networks difficult to interpret in the spatial domain. We observe
that the encrypted PrivateSNN can be implemented without a large performance
drop (less than ~5%) and with a significant energy-efficiency gain (about 60x
compared to a standard ANN). We conduct
extensive experiments on various datasets including CIFAR10, CIFAR100, and
TinyImageNet, highlighting the importance of privacy-preserving SNN training.
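The two-step recipe described in the abstract, synthesizing surrogate inputs from the pre-trained ANN and then converting it to a spiking network, can be sketched as follows. This is a minimal single-layer illustration with random surrogate inputs standing in for the paper's optimized class-conditional synthetic images; all names, shapes, and the threshold-balancing heuristic are assumptions for illustration, not the authors' exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pre-trained ANN layer: y = relu(W @ x).
W = rng.normal(size=(8, 16))

def ann_layer(x):
    return np.maximum(W @ x, 0.0)

# Step 1 (data-free): synthesize surrogate inputs instead of touching real
# training data. Here we simply sample random vectors; the paper instead
# optimizes class-conditional images from the ANN, which we do not reproduce.
synthetic = rng.normal(size=(64, 16))

# Step 2: threshold balancing -- set the integrate-and-fire (IF) threshold to
# the maximum activation observed on the synthetic inputs, a common
# conversion heuristic.
v_th = max(ann_layer(x).max() for x in synthetic)

def snn_layer(x, timesteps=200):
    """IF layer with soft reset: output firing rate approximates relu(Wx)/v_th."""
    v = np.zeros(W.shape[0])
    spikes = np.zeros(W.shape[0])
    for _ in range(timesteps):
        v += W @ x                 # integrate input current each step
        fired = v >= v_th
        spikes += fired
        v[fired] -= v_th           # soft reset preserves residual charge
    return spikes / timesteps      # firing rate in [0, 1]

x = rng.normal(size=16)
rate = snn_layer(x)
target = ann_layer(x) / v_th       # ANN activation normalized by threshold
print(np.abs(rate - np.clip(target, 0.0, 1.0)).max())  # bounded by ~1/timesteps
```

The subsequent weight encryption would then retrain these converted weights with a temporal learning rule, which this sketch does not cover.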
Related papers
- Are Neuromorphic Architectures Inherently Privacy-preserving? An Exploratory Study [3.4673556247932225]
Spiking Neural Networks (SNNs) are emerging as promising alternatives to Artificial Neural Networks (ANNs).
This paper examines whether SNNs inherently offer better privacy.
We analyze the impact of learning algorithms (surrogate gradient and evolutionary), frameworks (snnTorch, TENNLab, LAVA), and parameters on SNN privacy.
arXiv Detail & Related papers (2024-11-10T22:18:53Z) - NAS-BNN: Neural Architecture Search for Binary Neural Networks [55.058512316210056]
We propose a novel neural architecture search scheme for binary neural networks, named NAS-BNN.
Our discovered binary model family outperforms previous BNNs for a wide range of operations (OPs) from 20M to 200M.
In addition, we validate the transferability of the searched BNNs on the object detection task, and our binary detectors with the searched BNNs achieve a new state-of-the-art result, e.g., 31.6% mAP with 370M OPs, on the MS COCO dataset.
arXiv Detail & Related papers (2024-08-28T02:17:58Z) - Optimal ANN-SNN Conversion for High-accuracy and Ultra-low-latency
Spiking Neural Networks [22.532709609646066]
Spiking Neural Networks (SNNs) have attracted great attention due to their distinctive properties of low power consumption and fast inference on neuromorphic hardware.
As the most effective method for obtaining deep SNNs, ANN-SNN conversion has achieved performance comparable to ANNs on large-scale datasets.
In this paper, we theoretically analyze ANN-SNN conversion error and derive the estimated activation function of SNNs.
We prove that the expected conversion error between SNNs and ANNs is zero, enabling us to achieve high-accuracy and ultra-low-latency SNNs.
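The latency side of this trade-off can be illustrated with a toy calculation: for a rate-coded IF neuron with soft reset and a constant normalized input, the firing rate over T steps is floor(T * a / v_th) / T, so the gap to the ANN activation a / v_th is bounded by 1/T and shrinks as the simulation window grows. This is a generic quantization argument, not the paper's exact error model; the values below are hypothetical.

```python
import numpy as np

def conversion_error(a, v_th, T):
    """Gap between the IF firing rate over T steps and the ANN activation a/v_th."""
    rate = np.floor(T * a / v_th) / T   # soft-reset IF with constant input a
    return abs(rate - a / v_th)

# A hypothetical activation: the error falls roughly as 1/T.
a, v_th = 0.37, 1.0
for T in (4, 16, 64, 256):
    print(T, conversion_error(a, v_th, T))  # each error is at most 1/T
```

This is why naive rate-coded conversion needs long simulation windows, and why removing the expected error, as the paper claims, matters for ultra-low-latency inference.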
arXiv Detail & Related papers (2023-03-08T03:04:53Z) - Spikeformer: A Novel Architecture for Training High-Performance
Low-Latency Spiking Neural Network [6.8125324121155275]
We propose a novel Transformer-based SNN, termed "Spikeformer", which outperforms its ANN counterpart on both static and neuromorphic datasets.
Remarkably, our Spikeformer outperforms other SNNs on ImageNet by a large margin (i.e., more than 5%) and even outperforms its ANN counterpart by 3.1% and 2.2% on DVS-Gesture and ImageNet, respectively.
arXiv Detail & Related papers (2022-11-19T12:49:22Z) - SNN2ANN: A Fast and Memory-Efficient Training Framework for Spiking
Neural Networks [117.56823277328803]
Spiking neural networks are efficient computation models for low-power environments.
We propose an SNN-to-ANN (SNN2ANN) framework to train SNNs in a fast and memory-efficient way.
Experiment results show that our SNN2ANN-based models perform well on the benchmark datasets.
arXiv Detail & Related papers (2022-06-19T16:52:56Z) - Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Fully Spiking Variational Autoencoder [66.58310094608002]
Spiking neural networks (SNNs) can be run on neuromorphic devices with ultra-high speed and ultra-low energy consumption.
In this study, we build a variational autoencoder (VAE) with SNN to enable image generation.
arXiv Detail & Related papers (2021-09-26T06:10:14Z) - Spiking Neural Networks with Single-Spike Temporal-Coded Neurons for
Network Intrusion Detection [6.980076213134383]
The spiking neural network (SNN) is attractive due to its strong biological plausibility and high energy efficiency.
However, its performance still falls far behind that of conventional deep neural networks (DNNs).
arXiv Detail & Related papers (2020-10-15T14:46:18Z) - Boosting Deep Neural Networks with Geometrical Prior Knowledge: A Survey [77.99182201815763]
Deep Neural Networks (DNNs) achieve state-of-the-art results in many different problem settings.
DNNs are often treated as black box systems, which complicates their evaluation and validation.
One promising field, inspired by the success of convolutional neural networks (CNNs) in computer vision tasks, is to incorporate knowledge about symmetric geometrical transformations.
arXiv Detail & Related papers (2020-06-30T14:56:05Z) - Training Deep Spiking Neural Networks [0.0]
Brain-inspired spiking neural networks (SNNs) with neuromorphic hardware may offer orders of magnitude higher energy efficiency.
We show that it is possible to train an SNN with a ResNet50 architecture on the CIFAR100 and Imagenette object recognition datasets.
The trained SNN falls behind the analogous ANN in accuracy but requires several orders of magnitude fewer inference time steps.
arXiv Detail & Related papers (2020-06-08T09:47:05Z) - You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference
to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
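The time-to-first-spike (TTFS) coding mentioned in the last two entries can be sketched briefly: each neuron fires at most once, and the spike *time* carries the value, with larger activations firing earlier. That single-spike constraint is what makes these systems so sparse. The encoding below is a generic linear-latency scheme under assumed parameters, not the specific scheme of either paper.

```python
import numpy as np

T = 100  # number of discrete timesteps in the simulation window (assumed)

def ttfs_encode(x):
    """Map activations in [0, 1] to spike times; larger values fire earlier.
    An activation of exactly 0 never fires (represented by time T)."""
    x = np.clip(x, 0.0, 1.0)
    return np.where(x > 0, np.round((1.0 - x) * (T - 1)).astype(int), T)

def ttfs_decode(t):
    """Invert the encoding: earlier spike times decode to larger values."""
    return np.where(t < T, 1.0 - t / (T - 1), 0.0)

x = np.array([0.0, 0.25, 0.5, 1.0])
t = ttfs_encode(x)
print(t, ttfs_decode(t))  # round-trip recovers x up to time quantization
```

With one spike per neuron, downstream layers only perform an addition at each incoming spike time, which is the source of the energy advantage over multiply-and-accumulate hardware noted above.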
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information (including all content) and is not responsible for any consequences of its use.