Spike timing reshapes robustness against attacks in spiking neural
networks
- URL: http://arxiv.org/abs/2306.05654v1
- Date: Fri, 9 Jun 2023 03:48:57 GMT
- Title: Spike timing reshapes robustness against attacks in spiking neural
networks
- Authors: Jianhao Ding, Zhaofei Yu, Tiejun Huang and Jian K. Liu
- Abstract summary: Spiking neural networks (SNNs) are emerging as a new type of neural network model.
We explore the role of spike timing in SNNs, focusing on the robustness of the system against various types of attacks.
Our results suggest that the utility of spike timing coding in SNNs could improve the robustness against attacks.
- Score: 21.983346771962566
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The success of deep learning in the past decade is partially shrouded in the
shadow of adversarial attacks. In contrast, the brain is far more robust at
complex cognitive tasks. Utilizing the advantage that neurons in the brain
communicate via spikes, spiking neural networks (SNNs) are emerging as a new
type of neural network model, boosting the frontier of theoretical
investigation and empirical application of artificial neural networks and deep
learning. Neuroscience research proposes that the precise timing of neural
spikes plays an important role in the information coding and sensory processing
of the biological brain. However, the role of spike timing in SNNs is less
considered and far from understood. Here we systematically explored the timing
mechanism of spike coding in SNNs, focusing on the robustness of the system
against various types of attacks. We found that SNNs achieve greater improvements in
robustness when they use the coding principle of precise spike timing in
neural encoding and decoding, facilitated by different learning rules. Our
results suggest that the utility of spike timing coding in SNNs could improve
the robustness against attacks, providing a new approach to reliable coding
principles for developing next-generation brain-inspired deep learning.
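The abstract contrasts codes in which information is carried by firing rates with codes in which it is carried by precise spike times. As a rough, self-contained illustration only (not the authors' implementation; `num_steps` and the linear intensity-to-latency mapping are assumptions made for this sketch), the snippet below encodes the same inputs either as Bernoulli rate-coded spike trains or as single time-to-first-spike events:

```python
import numpy as np

def rate_encode(intensity: np.ndarray, num_steps: int = 20, rng=None) -> np.ndarray:
    """Rate code: each value in [0, 1] sets the per-step spike probability,
    so information is carried by the spike count over the whole window."""
    rng = np.random.default_rng(0) if rng is None else rng
    # Bernoulli spikes at every time step; output shape is (num_steps, *intensity.shape).
    return (rng.random((num_steps, *intensity.shape)) < intensity).astype(np.uint8)

def latency_encode(intensity: np.ndarray, num_steps: int = 20) -> np.ndarray:
    """Time-to-first-spike code: stronger inputs fire earlier; each input emits
    exactly one spike, and its timing (not the count) carries the value."""
    # Map intensity 1.0 to step 0 and intensity 0.0 to the last step (assumed linear mapping).
    spike_time = np.round((1.0 - np.clip(intensity, 0.0, 1.0)) * (num_steps - 1)).astype(int)
    spikes = np.zeros((num_steps, *intensity.shape), dtype=np.uint8)
    for idx, t in np.ndenumerate(spike_time):
        spikes[(t, *idx)] = 1
    return spikes

if __name__ == "__main__":
    x = np.array([0.1, 0.5, 0.9])            # toy "pixel" intensities
    print(rate_encode(x).sum(axis=0))         # spike counts grow with intensity
    print(latency_encode(x).argmax(axis=0))   # first-spike times shrink with intensity
```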
Related papers
- Stepwise Weighted Spike Coding for Deep Spiking Neural Networks [7.524721345903027]
Spiking Neural Networks (SNNs) seek to mimic the spiking behavior of biological neurons.
We propose a novel Stepwise Weighted Spike (SWS) coding scheme to enhance the encoding of information in spikes.
This approach compresses the spikes by weighting the significance of each spike at every step of neural computation, achieving high performance and low energy consumption.
arXiv Detail & Related papers (2024-08-30T12:39:25Z)
- Stochastic Spiking Neural Networks with First-to-Spike Coding [7.955633422160267]
Spiking Neural Networks (SNNs) are known for their bio-plausibility and energy efficiency.
In this work, we explore the merger of novel computing and information encoding schemes in SNN architectures.
We investigate the tradeoffs of our proposal in terms of accuracy, inference latency, spiking sparsity, and energy consumption across datasets.
arXiv Detail & Related papers (2024-04-26T22:52:23Z)
- Curriculum Design Helps Spiking Neural Networks to Classify Time Series [16.402675046686834]
Spiking Neural Networks (SNNs) have greater potential for modeling time series data than Artificial Neural Networks (ANNs).
In this work, enlightened by brain-inspired science, we find that not only the structure but also the learning process should be human-like.
arXiv Detail & Related papers (2023-12-26T02:04:53Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Exploiting Noise as a Resource for Computation and Learning in Spiking Neural Networks [32.0086664373154]
This study introduces the noisy spiking neural network (NSNN) and the noise-driven learning rule (NDL).
NSNN provides a theoretical framework that yields scalable, flexible, and reliable computation.
arXiv Detail & Related papers (2023-05-25T13:21:26Z)
- Deep Reinforcement Learning Guided Graph Neural Networks for Brain Network Analysis [61.53545734991802]
We propose a novel brain network representation framework, namely BN-GNN, which searches for the optimal GNN architecture for each brain network.
Our proposed BN-GNN improves the performance of traditional GNNs on different brain network analysis tasks.
arXiv Detail & Related papers (2022-03-18T07:05:27Z)
- Deep Reinforcement Learning with Spiking Q-learning [51.386945803485084]
Spiking neural networks (SNNs) are expected to realize artificial intelligence (AI) with less energy consumption.
Combining SNNs with deep reinforcement learning (RL) provides a promising, energy-efficient way to handle realistic control tasks.
arXiv Detail & Related papers (2022-01-21T16:42:11Z)
- Artificial Neural Variability for Deep Learning: On Overfitting, Noise Memorization, and Catastrophic Forgetting [135.0863818867184]
Artificial neural variability (ANV) helps artificial neural networks learn some advantages from "natural" neural networks.
ANV plays as an implicit regularizer of the mutual information between the training data and the learned model.
It can effectively relieve overfitting, label noise memorization, and catastrophic forgetting at negligible costs.
arXiv Detail & Related papers (2020-11-12T06:06:33Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.