DPSNN: A Differentially Private Spiking Neural Network with Temporal
Enhanced Pooling
- URL: http://arxiv.org/abs/2205.12718v3
- Date: Tue, 11 Apr 2023 11:44:15 GMT
- Title: DPSNN: A Differentially Private Spiking Neural Network with Temporal
Enhanced Pooling
- Authors: Jihang Wang, Dongcheng Zhao, Guobin Shen, Qian Zhang, Yi Zeng
- Abstract summary: Spiking neural networks (SNNs), a new generation of artificial neural networks, play a crucial role in many fields.
This paper combines the differential privacy (DP) algorithm with SNNs and proposes a differentially private spiking neural network (DPSNN).
The SNN transmits information through discrete spike sequences; combined with the gradient noise introduced by DP, this allows the SNN to maintain strong privacy protection.
- Score: 6.63071861272879
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Privacy protection is a crucial issue in machine learning, and existing
privacy-protection methods are built around traditional artificial neural
networks based on real values. The spiking neural network (SNN), a new
generation of artificial neural network, plays a crucial role in many fields,
so research on the privacy protection of SNNs is urgently needed. This paper
combines the differential privacy (DP) algorithm with SNNs and proposes a
differentially private spiking neural network (DPSNN). The SNN transmits
information through discrete spike sequences, which, combined with the gradient
noise introduced by DP, allows the SNN to maintain strong privacy protection.
At the same time, so that the SNN maintains high performance while obtaining
strong privacy protection, we propose the temporal enhanced pooling (TEP)
method. It fully integrates the temporal information of the SNN into the
spatial information transfer, enabling the SNN to transfer information more
effectively. We conduct experiments on static and neuromorphic datasets, and
the results show that our algorithm maintains high performance while providing
strong privacy protection.
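The abstract does not include an implementation, so the following is only a rough sketch of the two ingredients it describes: DP-style per-sample gradient clipping with Gaussian noise, and a temporally weighted pooling layer in the spirit of TEP. The class names, the linear time-weighting, and all hyperparameters are illustrative assumptions (PyTorch is assumed), not the authors' code.

```python
# Illustrative sketch only -- not the DPSNN reference implementation.
# Assumes PyTorch and a surrogate-gradient SNN whose outputs are spike
# tensors of shape (T, B, C, H, W). The TEP weighting below is a guess
# (later time steps weighted more); the paper may use a different rule.
import torch
import torch.nn as nn

class TemporalEnhancedPool(nn.Module):
    """Spatially pool each time step, then combine steps with fixed weights."""
    def __init__(self, num_steps):
        super().__init__()
        w = torch.linspace(0.5, 1.5, num_steps)           # assumed monotone weights
        self.register_buffer("time_weights", w / w.sum())
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, spikes):                             # (T, B, C, H, W)
        pooled = torch.stack([self.pool(s).flatten(1) for s in spikes])  # (T, B, C)
        return (self.time_weights[:, None, None] * pooled).sum(dim=0)    # (B, C)

def dp_sgd_step(model, per_sample_losses, optimizer, clip_norm=1.0, noise_mult=1.0):
    """One DP-SGD-style step: clip each per-sample gradient, add Gaussian noise."""
    summed = {name: torch.zeros_like(p) for name, p in model.named_parameters()}
    for loss_i in per_sample_losses:                       # per-example gradients
        grads = torch.autograd.grad(loss_i, list(model.parameters()), retain_graph=True)
        total_norm = torch.norm(torch.stack([g.norm() for g in grads]))
        scale = min(1.0, clip_norm / (total_norm.item() + 1e-6))
        for (name, _), g in zip(model.named_parameters(), grads):
            summed[name] += scale * g
    batch_size = len(per_sample_losses)
    for name, p in model.named_parameters():
        noise = torch.randn_like(p) * noise_mult * clip_norm
        p.grad = (summed[name] + noise) / batch_size       # noisy averaged gradient
    optimizer.step()
    optimizer.zero_grad()
```

In practice one would compute per_sample_losses with reduction='none', track the privacy budget with a standard (epsilon, delta) accountant, and place the pooling layer between the SNN backbone and the classifier; none of these details are specified in the abstract above.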
Related papers
- RSC-SNN: Exploring the Trade-off Between Adversarial Robustness and Accuracy in Spiking Neural Networks via Randomized Smoothing Coding [17.342181435229573]
Spiking Neural Networks (SNNs) have received widespread attention due to their unique neuronal dynamics and low-power nature.
Previous research empirically shows that SNNs with Poisson coding are more robust than Artificial Neural Networks (ANNs) on small-scale datasets.
This work theoretically demonstrates that SNN's inherent adversarial robustness stems from its Poisson coding.
arXiv Detail & Related papers (2024-07-29T15:26:15Z)
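The RSC-SNN entry above credits Poisson coding for the robustness gap. For reference, a minimal Poisson (rate) encoder fires each pixel as a Bernoulli event with probability equal to its normalized intensity; the snippet below is a generic illustration of that coding scheme (assuming PyTorch), not code from the cited paper.

```python
# Generic Poisson/rate-coding sketch (illustration only, not the paper's code):
# each pixel in [0, 1] fires at every time step with probability equal to its
# intensity, so the expected firing rate encodes the pixel value.
import torch

def poisson_encode(images, num_steps):
    """images: (B, C, H, W) in [0, 1] -> binary spikes: (T, B, C, H, W)."""
    probs = images.clamp(0.0, 1.0).unsqueeze(0).expand(num_steps, *images.shape)
    return torch.bernoulli(probs)

# Example: a random "image" batch encoded into 20 time steps of binary spikes.
spikes = poisson_encode(torch.rand(8, 1, 28, 28), num_steps=20)
print(spikes.shape)   # torch.Size([20, 8, 1, 28, 28]), values in {0, 1}
```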
- LC-TTFS: Towards Lossless Network Conversion for Spiking Neural Networks with TTFS Coding [55.64533786293656]
We show that our algorithm can achieve a near-perfect mapping between the activation values of an ANN and the spike times of an SNN on a number of challenging AI tasks.
The study paves the way for deploying ultra-low-power TTFS-based SNNs on power-constrained edge computing platforms.
arXiv Detail & Related papers (2023-10-23T14:26:16Z)
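The LC-TTFS entry above maps ANN activation values onto spike times. In generic time-to-first-spike (TTFS) coding, a stronger activation fires earlier and each neuron emits at most one spike; the sketch below illustrates that idea with a simple linear latency map, which is an assumption for illustration and not the LC-TTFS conversion algorithm.

```python
# Generic TTFS-coding sketch (assumed linear activation-to-latency map,
# one spike per neuron); this is not the LC-TTFS algorithm itself.
import torch

def ttfs_encode(activations, num_steps):
    """activations: (B, N) in [0, 1] -> spikes: (T, B, N); larger value fires earlier."""
    a = activations.clamp(0.0, 1.0)
    spike_step = ((1.0 - a) * (num_steps - 1)).round().long()   # latency per neuron
    spikes = torch.zeros(num_steps, *a.shape)
    spikes.scatter_(0, spike_step.unsqueeze(0), 1.0)            # one spike per neuron
    return spikes * (a > 0).float()                             # zero activation: silent

spikes = ttfs_encode(torch.tensor([[0.9, 0.5, 0.0]]), num_steps=10)
print(spikes[:, 0])   # the 0.9 neuron fires early, the 0.5 neuron mid-window, 0.0 never
```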
- A Homomorphic Encryption Framework for Privacy-Preserving Spiking Neural Networks [5.274804664403783]
Spiking Neural Networks (SNNs) mimic the behavior of the human brain to improve efficiency and reduce energy consumption.
Homomorphic encryption (HE) offers a solution, allowing calculations to be performed on encrypted data without decrypting it.
This research compares traditional deep neural networks (DNNs) and SNNs using the Brakerski/Fan-Vercauteren (BFV) encryption scheme.
arXiv Detail & Related papers (2023-08-10T15:26:35Z)
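The entry above evaluates DNN and SNN inference under the Brakerski/Fan-Vercauteren (BFV) scheme. As a generic illustration of what BFV provides (arithmetic on encrypted integers), the snippet below uses the TenSEAL library; this is an assumed choice for demonstration and not necessarily the framework used in the cited work.

```python
# BFV demo via the TenSEAL library (an assumed choice for illustration; the
# cited framework may use a different HE backend). BFV works on integers.
import tenseal as ts

context = ts.context(ts.SCHEME_TYPE.BFV,
                     poly_modulus_degree=4096,
                     plain_modulus=1032193)

enc = ts.bfv_vector(context, [1, 2, 3])     # encrypt an integer vector
enc_sum = enc + [10, 20, 30]                # homomorphic addition with a plain vector
enc_prod = enc * enc                        # homomorphic multiplication
print(enc_sum.decrypt())                    # [11, 22, 33]
print(enc_prod.decrypt())                   # [1, 4, 9]
```

Running a whole network under BFV additionally requires integer quantization of weights and activations and polynomial replacements for non-linear activations, which is where the DNN-versus-SNN comparison in the cited paper becomes relevant.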
- Toward Robust Spiking Neural Network Against Adversarial Perturbation [22.56553160359798]
Spiking neural networks (SNNs) are increasingly deployed in real-world, efficiency-critical applications.
Researchers have already demonstrated that an SNN can be attacked with adversarial examples.
To the best of our knowledge, this is the first analysis of robust training for SNNs.
arXiv Detail & Related papers (2022-04-12T21:26:49Z)
- Noise-Robust Deep Spiking Neural Networks with Temporal Information [22.278159848657754]
Spiking neural networks (SNNs) have emerged as energy-efficient neural networks with temporal information.
SNNs have shown superior efficiency on neuromorphic devices, but these devices are susceptible to noise, which hinders their use in real-world applications.
In this paper, we investigate the effect of noise on deep SNNs with various neural coding methods and present a noise-robust deep SNN with temporal information.
arXiv Detail & Related papers (2021-04-22T16:40:33Z)
- Deep Serial Number: Computational Watermarking for DNN Intellectual Property Protection [53.40245698216239]
DSN (Deep Serial Number) is a watermarking algorithm designed specifically for deep neural networks (DNNs).
Inspired by serial numbers in safeguarding conventional software IP, we propose the first implementation of serial number embedding within DNNs.
arXiv Detail & Related papers (2020-11-17T21:42:40Z)
- Deep Time Delay Neural Network for Speech Enhancement with Full Data Learning [60.20150317299749]
This paper proposes a deep time delay neural network (TDNN) for speech enhancement with full data learning.
To make full use of the training data, we propose a full data learning method for speech enhancement.
arXiv Detail & Related papers (2020-11-11T06:32:37Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
- Inherent Adversarial Robustness of Deep Spiking Neural Networks: Effects of Discrete Input Encoding and Non-Linear Activations [9.092733355328251]
The Spiking Neural Network (SNN) is a potential candidate for inherent robustness against adversarial attacks.
In this work, we demonstrate that adversarial accuracy of SNNs under gradient-based attacks is higher than their non-spiking counterparts.
arXiv Detail & Related papers (2020-03-23T17:20:24Z)
- CryptoSPN: Privacy-preserving Sum-Product Network Inference [84.88362774693914]
We present a framework for privacy-preserving inference of sum-product networks (SPNs).
CryptoSPN achieves highly efficient and accurate inference on the order of seconds for medium-sized SPNs.
arXiv Detail & Related papers (2020-02-03T14:49:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.