Spiking CapsNet: A Spiking Neural Network With A Biologically Plausible
Routing Rule Between Capsules
- URL: http://arxiv.org/abs/2111.07785v1
- Date: Mon, 15 Nov 2021 14:23:15 GMT
- Title: Spiking CapsNet: A Spiking Neural Network With A Biologically Plausible
Routing Rule Between Capsules
- Authors: Dongcheng Zhao, Yang Li, Yi Zeng, Jihang Wang, Qian Zhang
- Abstract summary: Spiking neural networks (SNNs) have attracted much attention due to their powerful spatio-temporal information representation ability.
CapsNet does well in assembling and coupling features at different levels.
We propose Spiking CapsNet by introducing capsules into the modelling of spiking neural networks.
- Score: 9.658836348699161
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spiking neural networks (SNNs) have attracted much attention due to
their powerful spatio-temporal information representation ability. Capsule
Neural Network (CapsNet) does well in assembling and coupling features at
different levels. Here, we propose Spiking CapsNet by introducing capsules
into the modelling of spiking neural networks. In addition, we propose a more
biologically plausible Spike-Timing-Dependent Plasticity (STDP) routing
mechanism. By fully considering the spatio-temporal relationship between the
low-level spiking capsules and the high-level spiking capsules, the coupling
ability between them is further improved. We have conducted verification
experiments on the MNIST and FashionMNIST datasets. Compared with other strong
SNN models, our algorithm still achieves high performance. Our Spiking CapsNet
fully combines the strengths of SNNs and CapsNets, and shows strong robustness
to noise and affine transformations. When different levels of Salt-and-Pepper
and Gaussian noise are added to the test dataset, the experimental results
demonstrate that our Spiking CapsNet remains markedly more robust as the noise
level grows, while the artificial neural network can no longer classify
correctly. Our Spiking CapsNet also shows strong generalization to affine
transformations on the AffNIST dataset.
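The abstract names an STDP routing rule but does not spell out its update
equations. As a minimal NumPy sketch of the general idea (the constants, array
shapes, and the softmax renormalisation below are illustrative assumptions,
not the authors' implementation), a pair-based STDP rule could strengthen the
coupling between a low-level and a high-level capsule when the former fires
shortly before the latter, and weaken it for the reverse order:

```python
import numpy as np

# Illustrative constants -- not values from the paper.
A_PLUS = 0.05   # potentiation amplitude (pre fires before post)
A_MINUS = 0.05  # depression amplitude (pre fires after post)
TAU = 5.0       # STDP time constant, in simulation timesteps

def stdp_routing_update(c, pre_spikes, post_spikes):
    """Pair-based STDP update of coupling coefficients c[i, j] between
    low-level capsule i and high-level capsule j.

    c:           (N_low, N_high) current coupling coefficients
    pre_spikes:  (T, N_low)  binary spike trains of low-level capsules
    post_spikes: (T, N_high) binary spike trains of high-level capsules
    """
    T = pre_spikes.shape[0]
    for t_post in range(T):
        for t_pre in range(T):
            dt = t_post - t_pre
            if dt == 0:
                continue
            decay = np.exp(-abs(dt) / TAU)
            # pairs[i, j] == 1 iff capsule i fired at t_pre and j at t_post
            pairs = np.outer(pre_spikes[t_pre], post_spikes[t_post])
            if dt > 0:   # causal pair: strengthen the coupling
                c = c + A_PLUS * decay * pairs
            else:        # anti-causal pair: weaken the coupling
                c = c - A_MINUS * decay * pairs
    # Softmax over high-level capsules, so each low-level capsule
    # distributes its output as a probability-like coupling vector.
    return np.exp(c) / np.exp(c).sum(axis=1, keepdims=True)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    c0 = np.zeros((8, 4))                              # 8 low-, 4 high-level capsules
    pre = (rng.random((10, 8)) < 0.2).astype(float)    # T = 10 timesteps
    post = (rng.random((10, 4)) < 0.2).astype(float)
    print(stdp_routing_update(c0, pre, post).round(3))
```

The softmax step mirrors the coupling-coefficient normalisation of the
original dynamic-routing CapsNet, keeping each low-level capsule's outgoing
couplings on a common scale.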
Related papers
- Temporal Spiking Neural Networks with Synaptic Delay for Graph Reasoning [91.29876772547348]
Spiking neural networks (SNNs) are investigated as biologically inspired models of neural computation.
This paper reveals that SNNs, when amalgamated with synaptic delay and temporal coding, are proficient in executing (knowledge) graph reasoning.
arXiv Detail & Related papers (2024-05-27T05:53:30Z) - Capsule Neural Networks as Noise Stabilizer for Time Series Data [20.29049860598735]
Capsule Neural Networks utilize capsules, which bind neurons into a single vector and learn position-equivariant features.
In this paper, we investigate the effectiveness of CapsNets in analyzing highly sensitive and noisy time series sensor data.
arXiv Detail & Related papers (2024-03-20T12:17:49Z) - Spikformer V2: Join the High Accuracy Club on ImageNet with an SNN
Ticket [81.9471033944819]
Spiking Neural Networks (SNNs) face the challenge of limited performance.
The self-attention mechanism, the cornerstone of the high-performance Transformer, is absent in existing SNNs.
We propose a novel Spiking Self-Attention (SSA) and Spiking Transformer (Spikformer)
arXiv Detail & Related papers (2024-01-04T01:33:33Z) - Deep Multi-Threshold Spiking-UNet for Image Processing [51.88730892920031]
This paper introduces the novel concept of Spiking-UNet for image processing, which combines the power of Spiking Neural Networks (SNNs) with the U-Net architecture.
To achieve an efficient Spiking-UNet, we face two primary challenges: ensuring high-fidelity information propagation through the network via spikes and formulating an effective training strategy.
Experimental results show that, on image segmentation and denoising, our Spiking-UNet achieves comparable performance to its non-spiking counterpart.
arXiv Detail & Related papers (2023-07-20T16:00:19Z) - Spikingformer: Spike-driven Residual Learning for Transformer-based
Spiking Neural Network [19.932683405796126]
Spiking neural networks (SNNs) offer a promising energy-efficient alternative to artificial neural networks.
SNNs suffer from non-spike computations caused by the structure of their residual connection.
We develop Spikingformer, a pure transformer-based spiking neural network.
arXiv Detail & Related papers (2023-04-24T09:44:24Z) - RobCaps: Evaluating the Robustness of Capsule Networks against Affine
Transformations and Adversarial Attacks [11.302789770501303]
Capsule Networks (CapsNets) are able to hierarchically preserve the pose relationships between multiple objects for image classification tasks.
In this paper, we evaluate different factors affecting the robustness of CapsNets, compared to traditional Convolutional Neural Networks (CNNs)
arXiv Detail & Related papers (2023-04-08T09:58:35Z) - Exploiting High Performance Spiking Neural Networks with Efficient
Spiking Patterns [4.8416725611508244]
Spiking Neural Networks (SNNs) use discrete spike sequences to transmit information, which significantly mimics the information transmission of the brain.
This paper introduces the dynamic Burst pattern and designs the Leaky Integrate and Fire or Burst (LIFB) neuron, which trades off short-time performance against dynamic temporal performance (a generic LIF sketch follows this list).
arXiv Detail & Related papers (2023-01-29T04:22:07Z) - Spikformer: When Spiking Neural Network Meets Transformer [102.91330530210037]
We consider two biologically plausible structures, the Spiking Neural Network (SNN) and the self-attention mechanism.
We propose a novel Spiking Self Attention (SSA) as well as a powerful framework, named Spiking Transformer (Spikformer)
arXiv Detail & Related papers (2022-09-29T14:16:49Z) - Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Parallel Capsule Networks for Classification of White Blood Cells [1.5749416770494706]
Capsule Networks (CapsNets) are a machine learning architecture proposed to overcome some of the shortcomings of convolutional neural networks (CNNs)
We present a new architecture, parallel CapsNets, which exploits the concept of branching the network to isolate certain capsules.
arXiv Detail & Related papers (2021-08-05T14:30:44Z) - Subspace Capsule Network [85.69796543499021]
SubSpace Capsule Network (SCN) exploits the idea of capsule networks to model possible variations in the appearance or implicitly defined properties of an entity.
SCN can be applied to both discriminative and generative models without incurring computational overhead compared to CNN during test time.
arXiv Detail & Related papers (2020-02-07T17:51:56Z)
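Several papers above (the LIFB neuron, Spikformer, Spiking-UNet) build on
leaky integrate-and-fire dynamics. The following is a minimal sketch of a
discrete-time LIF neuron; the burst_size parameter is a toy stand-in in the
spirit of the LIFB entry, since the actual LIFB dynamics are not described in
the one-line summary above:

```python
import numpy as np

def lif_forward(inputs, beta=0.9, v_th=1.0, burst_size=1):
    """Discrete-time leaky integrate-and-fire neuron.

    inputs:     (T,) input current per timestep
    beta:       membrane leak factor per step
    v_th:       firing threshold
    burst_size: spikes emitted per threshold crossing; burst_size > 1
                is a purely illustrative burst mode, not the paper's rule.
    Returns (T,) spike counts and the membrane potential trace.
    """
    v = 0.0
    spikes, trace = [], []
    for i_t in inputs:
        v = beta * v + i_t             # leaky integration of input current
        if v >= v_th:
            spikes.append(burst_size)  # emit a spike (or a small burst)
            v = 0.0                    # hard reset after firing
        else:
            spikes.append(0)
        trace.append(v)
    return np.array(spikes), np.array(trace)

if __name__ == "__main__":
    spikes, v_trace = lif_forward(np.full(20, 0.3))
    print(spikes)   # periodic firing once v crosses the threshold
```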