Razor SNN: Efficient Spiking Neural Network with Temporal Embeddings
- URL: http://arxiv.org/abs/2306.17597v1
- Date: Fri, 30 Jun 2023 12:17:30 GMT
- Title: Razor SNN: Efficient Spiking Neural Network with Temporal Embeddings
- Authors: Yuan Zhang, Jian Cao, Ling Zhang, Jue Chen, Wenyu Sun, Yuan Wang
- Abstract summary: Event streams generated by dynamic vision sensors (DVS) are sparse and non-uniform in the spatial domain.
We propose an event-sparsification spiking framework, dubbed Razor SNN, which prunes pointless event frames progressively.
Our Razor SNN consistently achieves competitive performance on four event-based benchmarks.
- Score: 20.048679993279936
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The event streams generated by dynamic vision sensors (DVS) are sparse and
non-uniform in the spatial domain, yet dense and redundant in the temporal
domain. Although the spiking neural network (SNN), an event-driven
neuromorphic model, has the potential to extract spatio-temporal features from
event streams, it is neither effective nor efficient enough. Motivated by this, we
propose an event-sparsification spiking framework, dubbed Razor SNN, that prunes
pointless event frames progressively. Concretely, we extend the dynamic
mechanism with global temporal embeddings, reconstruct the features,
and emphasize the effect of events adaptively at the training stage. During the
inference stage, we eliminate fruitless frames hierarchically according to a
binary mask generated by the trained temporal embeddings. Comprehensive
experiments demonstrate that our Razor SNN consistently achieves competitive
performance on four event-based benchmarks: DVS 128 Gesture, N-Caltech 101,
CIFAR10-DVS and SHD.
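For intuition, the inference-stage pruning described above can be sketched in a few lines. Everything here (the function name, scoring frames by a dot product with their temporal embedding, and a top-k binary mask) is a hypothetical simplification of the paper's mechanism, not its actual implementation:

```python
import numpy as np

def razor_prune(frames, temporal_embeddings, keep_ratio=0.5):
    """Hypothetical sketch of embedding-driven frame pruning.

    frames:              (T, D) per-frame feature vectors
    temporal_embeddings: (T, D) trained global temporal embeddings
    keep_ratio:          fraction of frames retained at inference
    """
    T = frames.shape[0]
    # Score each frame by its agreement with its trained embedding.
    scores = np.einsum("td,td->t", frames, temporal_embeddings)
    k = max(1, int(round(T * keep_ratio)))
    # Binary mask: True for the k highest-scoring frames, False otherwise.
    mask = np.zeros(T, dtype=bool)
    mask[np.argsort(scores)[-k:]] = True
    return frames[mask], mask
```

Frames whose mask entry is False would simply never be fed to the SNN, which is where the inference-time savings come from.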
Related papers
- A dynamic vision sensor object recognition model based on trainable event-driven convolution and spiking attention mechanism [9.745798797360886]
Spiking Neural Networks (SNNs) are well-suited for processing event streams from Dynamic Vision Sensors (DVS).
To extract features from DVS objects, SNNs commonly use event-driven convolution with fixed kernel parameters.
We propose a DVS object recognition model that utilizes a trainable event-driven convolution and a spiking attention mechanism.
arXiv Detail & Related papers (2024-09-19T12:01:05Z)
- Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency on neuromorphic datasets.
We propose the Hybrid Step-wise Distillation (HSD) method, tailored for neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z)
- Temporal Contrastive Learning for Spiking Neural Networks [23.963069990569714]
Biologically inspired spiking neural networks (SNNs) have garnered considerable attention due to their low energy consumption and temporal information processing capabilities.
We propose a novel method to obtain SNNs with low latency and high performance by incorporating contrastive supervision with temporal domain information.
arXiv Detail & Related papers (2023-05-23T10:31:46Z)
- STSC-SNN: Spatio-Temporal Synaptic Connection with Temporal Convolution and Attention for Spiking Neural Networks [7.422913384086416]
Spiking Neural Networks (SNNs), as one of the algorithmic models in neuromorphic computing, have gained a great deal of research attention owing to temporal processing capability.
Existing synaptic structures in SNNs are mostly full connections or spatial 2D convolutions, neither of which can extract temporal dependencies adequately.
We take inspiration from biological synapses and propose a spatio-temporal synaptic connection SNN model to enhance the spatio-temporal receptive fields of synaptic connections.
We show that endowing synaptic models with temporal dependencies can improve the performance of SNNs on classification tasks.
arXiv Detail & Related papers (2022-10-11T08:13:22Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance with low latency.
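For context on the non-differentiability problem this entry mentions: the spike function has zero gradient almost everywhere, so gradient-based training needs a workaround. DSR differentiates through a spike representation; the sketch below instead illustrates the more common surrogate-gradient idea (a sigmoid-derivative stand-in, named and chosen by us, not the DSR method):

```python
import numpy as np

def heaviside(v, threshold=1.0):
    """Forward pass: a neuron spikes when its membrane potential v
    crosses the threshold. Gradient is zero almost everywhere."""
    return (v >= threshold).astype(np.float32)

def surrogate_grad(v, threshold=1.0, alpha=2.0):
    """Smooth stand-in for d(spike)/d(v) used only during backprop;
    here the derivative of a scaled sigmoid, one common choice."""
    s = 1.0 / (1.0 + np.exp(-alpha * (v - threshold)))
    return alpha * s * (1.0 - s)
```

During training, the forward pass keeps the hard `heaviside`, while the backward pass substitutes `surrogate_grad`, which peaks at the threshold and decays away from it.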
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- AEGNN: Asynchronous Event-based Graph Neural Networks [54.528926463775946]
Event-based Graph Neural Networks generalize standard GNNs to process events as "evolving" spatio-temporal graphs.
AEGNNs are easily trained on synchronous inputs and can be converted to efficient, "asynchronous" networks at test time.
arXiv Detail & Related papers (2022-03-31T16:21:12Z)
- Ultra-low Latency Spiking Neural Networks with Spatio-Temporal Compression and Synaptic Convolutional Block [4.081968050250324]
Spiking neural networks (SNNs) have spatio-temporal information processing capability, low power consumption, and high biological plausibility.
The N-MNIST, CIFAR10-DVS and DVS128 Gesture datasets need to aggregate individual events into frames with a higher temporal resolution for event stream classification.
We propose a spatio-temporal compression method to aggregate individual events into a few time steps of synaptic current to reduce the training and inference latency.
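The aggregation step this entry refers to, turning raw DVS events into a short sequence of frames, can be sketched as follows. The binning scheme and helper name are illustrative assumptions; real pipelines differ per dataset and often handle polarity or normalization differently:

```python
import numpy as np

def events_to_frames(events, sensor_shape, num_steps):
    """Aggregate DVS events into `num_steps` event-count frames.

    events: (N, 4) array with columns timestamp, x, y, polarity (0 or 1).
    Returns frames of shape (num_steps, 2, H, W), one channel per polarity.
    """
    H, W = sensor_shape
    t = events[:, 0]
    # Assign each event to one of `num_steps` equal-duration time bins.
    bins = np.minimum(
        ((t - t.min()) / (np.ptp(t) + 1e-9) * num_steps).astype(int),
        num_steps - 1,
    )
    frames = np.zeros((num_steps, 2, H, W), dtype=np.float32)
    for (ts, x, y, p), b in zip(events, bins):
        # Count events per time bin, polarity channel, and pixel.
        frames[b, int(p), int(y), int(x)] += 1.0
    return frames
```

Fewer time steps means fewer SNN forward passes, which is the latency lever the compression method exploits.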
arXiv Detail & Related papers (2022-03-18T15:14:13Z)
- Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z)
- Temporal-wise Attention Spiking Neural Networks for Event Streams Classification [6.623034896340885]
Spiking neural network (SNN) is a brain-inspired, event-triggered computing model.
In this work, we propose a temporal-wise attention SNN (TA-SNN) model to learn frame-based representations for processing event streams.
We demonstrate that TA-SNN models improve the accuracy of event streams classification tasks.
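As a rough illustration of frame-wise temporal attention, the sketch below weights each input frame by a scalar derived from a squeeze step plus softmax. This is a minimal stand-in we wrote for intuition, not the actual TA-SNN module, which learns its weights end to end:

```python
import numpy as np

def temporal_attention(frames):
    """Weight each frame of a (T, ...) sequence by a softmax attention
    score computed from the frame's global average (squeeze step)."""
    T = frames.shape[0]
    squeezed = frames.reshape(T, -1).mean(axis=1)  # (T,) per-frame statistic
    w = np.exp(squeezed - squeezed.max())
    w /= w.sum()                                   # softmax attention weights
    # Broadcast the scalar weight over each frame's spatial dimensions.
    return frames * w.reshape(T, *([1] * (frames.ndim - 1)))
```

Frames with larger statistics are emphasized and near-empty frames are suppressed, mirroring the paper's goal of discounting uninformative time steps.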
arXiv Detail & Related papers (2021-07-25T02:28:44Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Event-Based Angular Velocity Regression with Spiking Networks [51.145071093099396]
Spiking Neural Networks (SNNs) process information conveyed as temporal spikes rather than numeric values.
We address, for the first time, a temporal regression problem: predicting numerical values from the events of an event camera.
We show that we can successfully train an SNN to perform angular velocity regression.
arXiv Detail & Related papers (2020-03-05T17:37:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.