Noise-Injected Spiking Graph Convolution for Energy-Efficient 3D Point Cloud Denoising
- URL: http://arxiv.org/abs/2502.19660v2
- Date: Thu, 06 Mar 2025 03:14:41 GMT
- Title: Noise-Injected Spiking Graph Convolution for Energy-Efficient 3D Point Cloud Denoising
- Authors: Zikuan Li, Qiaoyun Wu, Jialin Zhang, Kaijun Zhang, Jun Wang
- Abstract summary: Spiking neural networks (SNNs) have exhibited superior energy efficiency in 2D classification tasks over traditional artificial neural networks (ANNs). We propose noise-injected spiking graph convolutional networks to leverage the full regression potential of SNNs in 3D point cloud denoising.
- Score: 8.698008330627536
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks (SNNs), inspired by the spiking computation paradigm of biological neural systems, have exhibited superior energy efficiency in 2D classification tasks over traditional artificial neural networks (ANNs). However, the regression potential of SNNs has not been well explored, especially in 3D point cloud processing. In this paper, we propose noise-injected spiking graph convolutional networks to leverage the full regression potential of SNNs in 3D point cloud denoising. Specifically, we first emulate noise-injected neuronal dynamics to build noise-injected spiking neurons. On this basis, we design a noise-injected spiking graph convolution that promotes disturbance-aware spiking representation learning on 3D points. Starting from this spiking graph convolution, we build two SNN-based denoising networks. One is a purely spiking graph convolutional network, which incurs only a small accuracy loss compared with ANN-based alternatives while significantly reducing energy consumption on two benchmark datasets, PU-Net and PC-Net. The other is a hybrid architecture that incorporates ANN-based learning, achieving a strong performance-efficiency trade-off in just a few time steps. Our work highlights the potential of SNNs for 3D point cloud denoising, offering new perspectives on deployment on neuromorphic chips and paving the way for energy-efficient 3D data acquisition devices.
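The listing gives no implementation details, so the following is only a minimal PyTorch sketch of the two ingredients the abstract names: a leaky integrate-and-fire neuron with Gaussian noise injected into its membrane potential, and a k-nearest-neighbour (EdgeConv-style) graph convolution feeding that neuron. All class names, hyperparameters (tau, threshold, noise_std, k), the hard-reset rule, and the max aggregation are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class NoisyLIFNeuron(nn.Module):
    """Leaky integrate-and-fire neuron with Gaussian noise injected into the
    membrane potential before thresholding (illustrative, not the paper's model)."""

    def __init__(self, tau: float = 2.0, threshold: float = 1.0, noise_std: float = 0.1):
        super().__init__()
        self.tau, self.threshold, self.noise_std = tau, threshold, noise_std

    def forward(self, current: torch.Tensor) -> torch.Tensor:
        # current: (T, N, C) input currents over T time steps for N points
        v = torch.zeros_like(current[0])               # membrane potential
        spikes = []
        for t in range(current.shape[0]):
            v = v + (current[t] - v) / self.tau        # leaky integration
            if self.training and self.noise_std > 0:   # noise injection
                v = v + self.noise_std * torch.randn_like(v)
            s = (v >= self.threshold).float()          # fire where the threshold is crossed
            v = v * (1.0 - s)                          # hard reset where a spike fired
            spikes.append(s)
        return torch.stack(spikes)                     # (T, N, C) binary spike train


class NoisySpikingGraphConv(nn.Module):
    """EdgeConv-style k-NN graph convolution followed by the noisy spiking neuron."""

    def __init__(self, in_ch: int, out_ch: int, k: int = 16):
        super().__init__()
        self.k = k
        self.fc = nn.Linear(2 * in_ch, out_ch)         # synaptic weights on edge features
        self.neuron = NoisyLIFNeuron()

    def forward(self, x: torch.Tensor, pos: torch.Tensor) -> torch.Tensor:
        # x: (T, N, C) per-point features, pos: (N, 3) point coordinates
        idx = torch.cdist(pos, pos).topk(self.k, largest=False).indices  # (N, k) neighbours
        nbr = x[:, idx, :]                                   # (T, N, k, C) neighbour features
        ctr = x.unsqueeze(2).expand(-1, -1, self.k, -1)      # (T, N, k, C) centre features
        edge = torch.cat([ctr, nbr - ctr], dim=-1)           # relative edge features
        h = self.fc(edge).max(dim=2).values                  # aggregate over the neighbourhood
        return self.neuron(h)                                # (T, N, out_ch) spike train


if __name__ == "__main__":
    pos = torch.rand(1024, 3)                    # a (noisy) point cloud
    x = pos.unsqueeze(0).repeat(4, 1, 1)         # feed coordinates for T = 4 time steps
    layer = NoisySpikingGraphConv(in_ch=3, out_ch=64, k=16)
    print(layer(x, pos).shape)                   # torch.Size([4, 1024, 64])
```

A denoising network in the spirit of the abstract would stack a few such layers and regress per-point displacements from the accumulated spike features; in this sketch the noise injection is active only in training mode.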
Related papers
- Spiking Point Transformer for Point Cloud Classification [13.66046704077193]
Spiking Point Transformer (SPT) is the first transformer-based SNN framework for point cloud classification. SPT attains state-of-the-art results on three benchmark datasets that span both real-world and synthetic datasets in the SNN domain.
arXiv Detail & Related papers (2025-02-19T13:28:55Z) - Activation-wise Propagation: A Universal Strategy to Break Timestep Constraints in Spiking Neural Networks for 3D Data Processing [29.279985043923386]
We introduce Activation-wise Membrane Potential Propagation (AMP2), a novel state update mechanism for spiking neurons.
Inspired by skip connections in deep networks, AMP2 incorporates the membrane potential of neurons into the network, eliminating the need for iterative updates.
Our method achieves significant improvements across various 3D modalities, including 3D point clouds and event streams.
arXiv Detail & Related papers (2025-02-18T11:52:25Z) - Efficient 3D Recognition with Event-driven Spike Sparse Convolution [15.20476631850388]
Spiking Neural Networks (SNNs) provide an energy-efficient way to extract 3D-temporal features. We introduce the Spike Voxel Coding (SVC) scheme, which encodes 3D point clouds into a sparse spike train space. We also propose a Spike Sparse Convolution (SSC) model for efficiently extracting 3D sparse point cloud features.
arXiv Detail & Related papers (2024-12-10T09:55:15Z) - Graph Neural Networks Provably Benefit from Structural Information: A Feature Learning Perspective [53.999128831324576]
Graph neural networks (GNNs) have pioneered advancements in graph representation learning.
This study investigates the role of graph convolution within the context of feature learning theory.
arXiv Detail & Related papers (2023-06-24T10:21:11Z) - High-performance deep spiking neural networks with 0.3 spikes per neuron [9.01407445068455]
It is harder to train biologically-inspired spiking neural networks (SNNs) than artificial neural networks (ANNs).
We show that deep SNN models can be trained to achieve exactly the same performance as ANNs.
Our SNN accomplishes high-performance classification with less than 0.3 spikes per neuron, lending itself to an energy-efficient implementation.
arXiv Detail & Related papers (2023-06-14T21:01:35Z) - Desire Backpropagation: A Lightweight Training Algorithm for Multi-Layer Spiking Neural Networks based on Spike-Timing-Dependent Plasticity [13.384228628766236]
Spiking neural networks (SNNs) are a viable alternative to conventional artificial neural networks.
We present desire backpropagation, a method to derive the desired spike activity of all neurons, including the hidden ones.
We trained three-layer networks to classify MNIST and Fashion-MNIST images and reached an accuracy of 98.41% and 87.56%, respectively.
arXiv Detail & Related papers (2022-11-10T08:32:13Z) - Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel event-based video reconstruction framework based on a fully spiking neural network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z) - Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z) - Binary Graph Neural Networks [69.51765073772226]
Graph Neural Networks (GNNs) have emerged as a powerful and flexible framework for representation learning on irregular data.
In this paper, we present and evaluate different strategies for the binarization of graph neural networks.
We show that through careful design of the models, and control of the training process, binary graph neural networks can be trained at only a moderate cost in accuracy on challenging benchmarks.
arXiv Detail & Related papers (2020-12-31T18:48:58Z) - Optimizing Memory Placement using Evolutionary Graph Reinforcement Learning [56.83172249278467]
We introduce Evolutionary Graph Reinforcement Learning (EGRL), a method designed for large search spaces.
We train and validate our approach directly on the Intel NNP-I chip for inference.
We additionally achieve 28-78% speed-up compared to the native NNP-I compiler on all three workloads.
arXiv Detail & Related papers (2020-07-14T18:50:12Z) - SiamSNN: Siamese Spiking Neural Networks for Energy-Efficient Object Tracking [20.595208488431766]
SiamSNN is the first deep SNN tracker that achieves short latency and low precision loss on the visual object tracking benchmarks OTB2013, VOT2016, and GOT-10k.
SiamSNN notably achieves low energy consumption and real-time performance on the neuromorphic chip TrueNorth.
arXiv Detail & Related papers (2020-03-17T08:49:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.