Spiking PointNet: Spiking Neural Networks for Point Clouds
- URL: http://arxiv.org/abs/2310.06232v1
- Date: Tue, 10 Oct 2023 00:59:26 GMT
- Title: Spiking PointNet: Spiking Neural Networks for Point Clouds
- Authors: Dayong Ren, Zhe Ma, Yuanpei Chen, Weihang Peng, Xiaode Liu, Yuhan
Zhang, Yufei Guo
- Abstract summary: Spiking PointNet is the first spiking neural model for efficient deep learning on point clouds.
Our Spiking PointNet is trained with only a single time step but can obtain better performance with multiple-time-step inference.
Notably, our Spiking PointNet can even outperform its ANN counterpart, which is rare in the SNN field.
- Score: 15.934361665062584
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, Spiking Neural Networks (SNNs), which enjoy extreme energy
efficiency, have drawn much research attention on 2D visual recognition and shown
gradually increasing application potential. However, it remains underexplored
whether SNNs can be generalized to 3D recognition. To this end, we present
Spiking PointNet in the paper, the first spiking neural model for efficient
deep learning on point clouds. We find that two major obstacles limit
the application of SNNs to point clouds: the intrinsic optimization
difficulty of SNNs, which impedes training a large spiking model with many
time steps, and the expensive memory and computation cost of PointNet, which
makes training a large spiking point model impractical. To solve both problems
simultaneously, we present a trained-less but learning-more paradigm for
Spiking PointNet with theoretical justifications and in-depth experimental
analysis. Specifically, our Spiking PointNet is trained with only a single time
step but obtains better performance with multiple-time-step inference than a
model trained directly with multiple time steps. We conduct various experiments
on ModelNet10 and ModelNet40 to demonstrate the effectiveness of Spiking
PointNet. Notably, our Spiking PointNet can even outperform its ANN
counterpart, which is rare in the SNN field and thus suggests a promising
research direction for future work. Moreover, Spiking PointNet shows impressive
speedup and storage savings in the training phase.
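To give a rough intuition for the "train with one time step, infer with many" idea (this is a minimal illustrative sketch, not the paper's actual code or model): a spiking neuron's average output over T inference steps behaves like a rate-coded approximation of an activation, and larger T yields a finer-grained estimate even though the weights were never trained with multiple steps. The neuron model and values below are assumptions for illustration only.

```python
# Minimal integrate-and-fire (IF) neuron sketch: constant input current x,
# threshold v_th, soft reset. Averaging spikes over T steps approximates x,
# so increasing T at inference refines the output without retraining.

def if_neuron_rate(x, T, v_th=1.0):
    """Run an IF neuron for T time steps on a constant input x and
    return the average spike output (the firing rate)."""
    v = 0.0        # membrane potential
    spikes = 0
    for _ in range(T):
        v += x             # integrate the input current
        if v >= v_th:      # fire once the threshold is crossed
            spikes += 1
            v -= v_th      # soft reset: subtract the threshold
    return spikes / T

# With T = 1 the output is binary (0 or 1); with larger T the firing
# rate can resolve intermediate values such as x = 0.3.
for T in (1, 4, 16):
    print(T, if_neuron_rate(0.3, T))
```

Note this toy neuron only shows why extra inference steps can help; the paper additionally addresses how to make a model trained with a single step behave well under such multi-step inference.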
Related papers
- Spiking CenterNet: A Distillation-boosted Spiking Neural Network for Object Detection [15.043707655842592]
Spiking Neural Networks (SNNs) are a promising approach to address this challenge.
We propose Spiking CenterNet for object detection on event data.
Our work is the first approach that takes advantage of knowledge distillation in the field of spiking object detection.
arXiv Detail & Related papers (2024-02-02T10:23:03Z)
- Rethinking Residual Connection in Training Large-Scale Spiking Neural Networks [10.286425749417216]
The Spiking Neural Network (SNN) is among the most well-known brain-inspired models.
Its non-differentiable spiking mechanism makes it hard to train large-scale SNNs.
arXiv Detail & Related papers (2023-11-09T06:48:29Z)
- Deep Multi-Threshold Spiking-UNet for Image Processing [51.88730892920031]
This paper introduces the novel concept of Spiking-UNet for image processing, which combines the power of Spiking Neural Networks (SNNs) with the U-Net architecture.
To achieve an efficient Spiking-UNet, we face two primary challenges: ensuring high-fidelity information propagation through the network via spikes and formulating an effective training strategy.
Experimental results show that, on image segmentation and denoising, our Spiking-UNet achieves comparable performance to its non-spiking counterpart.
arXiv Detail & Related papers (2023-07-20T16:00:19Z)
- Online Training Through Time for Spiking Neural Networks [66.7744060103562]
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
Recent progress in training methods has enabled successful deep SNNs on large-scale tasks with low latency.
We propose online training through time (OTTT) for SNNs, which is derived from BPTT to enable forward-in-time learning.
arXiv Detail & Related papers (2022-10-09T07:47:56Z)
- PointAttN: You Only Need Attention for Point Cloud Completion [89.88766317412052]
Point cloud completion refers to completing 3D shapes from partial 3D point clouds.
We propose a novel neural network that processes point clouds in a per-point manner, eliminating k-nearest-neighbor (kNN) searches.
The proposed framework, namely PointAttN, is simple, neat, and effective, and can precisely capture the structural information of 3D shapes.
arXiv Detail & Related papers (2022-03-16T09:20:01Z)
- FatNet: A Feature-attentive Network for 3D Point Cloud Processing [1.502579291513768]
We introduce a novel feature-attentive neural network layer, a FAT layer, that combines both global point-based features and local edge-based features in order to generate better embeddings.
Our architecture achieves state-of-the-art results on the task of point cloud classification, as demonstrated on the ModelNet40 dataset.
arXiv Detail & Related papers (2021-04-07T23:13:56Z)
- NetAdaptV2: Efficient Neural Architecture Search with Fast Super-Network Training and Architecture Optimization [15.63765190153914]
We present NetAdaptV2 with three innovations to better balance the time spent for each step while supporting non-differentiable search metrics.
First, we propose channel-level bypass connections that merge network depth and layer width into a single search dimension.
Second, ordered dropout is proposed to train multiple DNNs in a single forward-backward pass to decrease the time for training a super-network.
arXiv Detail & Related papers (2021-03-31T18:03:46Z)
- S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration [74.5509794733707]
We present a novel guided learning paradigm that distills knowledge from real-valued networks into binary networks through the final prediction distribution.
Our proposed method can boost the simple contrastive learning baseline by an absolute gain of 5.515% on BNNs.
Our method achieves substantial improvement over the simple contrastive learning baseline, and is even comparable to many mainstream supervised BNN methods.
arXiv Detail & Related papers (2021-02-17T18:59:28Z)
- Attentive Graph Neural Networks for Few-Shot Learning [74.01069516079379]
Graph Neural Networks (GNNs) have demonstrated superior performance in many challenging applications, including few-shot learning tasks.
Despite their powerful capacity to learn and generalize from few samples, GNNs usually suffer from severe over-fitting and over-smoothing as the model becomes deep.
We propose a novel Attentive GNN to tackle these challenges, by incorporating a triple-attention mechanism.
arXiv Detail & Related papers (2020-07-14T07:43:09Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.