DRiVE: Dynamic Recognition in VEhicles using snnTorch
- URL: http://arxiv.org/abs/2502.10421v1
- Date: Tue, 04 Feb 2025 11:01:13 GMT
- Title: DRiVE: Dynamic Recognition in VEhicles using snnTorch
- Authors: Heerak Vora, Param Pathak, Parul Bakaraniya
- Abstract summary: Spiking Neural Networks (SNNs) mimic biological brain activity, processing data efficiently through an event-driven design. This study combines SNNs with PyTorch's adaptable framework, snnTorch, to test their potential for image-based tasks. We introduce DRiVE, a vehicle detection model that uses spiking neuron dynamics to classify images, achieving 94.8% accuracy and a near-perfect 0.99 AUC score.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spiking Neural Networks (SNNs) mimic biological brain activity, processing data efficiently through an event-driven design, wherein the neurons activate only when inputs exceed specific thresholds. Their ability to track voltage changes over time via membrane potential dynamics helps retain temporal information. This study combines SNNs with PyTorch's adaptable framework, snnTorch, to test their potential for image-based tasks. We introduce DRiVE, a vehicle detection model that uses spiking neuron dynamics to classify images, achieving 94.8% accuracy and a near-perfect 0.99 AUC score. These results highlight DRiVE's ability to distinguish vehicle classes effectively, challenging the notion that SNNs are limited to temporal data. As interest grows in energy-efficient neural models, DRiVE's success emphasizes the need to refine SNN optimization for visual tasks. This work encourages broader exploration of SNNs in scenarios where conventional networks struggle, particularly for real-world applications requiring both precision and efficiency.
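As a concrete illustration of the mechanism the abstract describes, leaky integrate-and-fire (LIF) neurons integrate weighted inputs into a membrane potential, fire a spike only when that potential crosses a threshold, and otherwise let the potential decay; snnTorch exposes this dynamic through its `Leaky` neuron. The sketch below shows that rate-coded classification pattern in snnTorch; the layer sizes, input resolution, number of time steps, and two-class setup are illustrative assumptions, not the published DRiVE architecture.

```python
import torch
import torch.nn as nn
import snntorch as snn


class SpikingVehicleClassifier(nn.Module):
    """Minimal rate-coded SNN sketch; sizes and step count are placeholders."""

    def __init__(self, num_inputs=64 * 64, num_hidden=256, num_classes=2, beta=0.9):
        super().__init__()
        self.fc1 = nn.Linear(num_inputs, num_hidden)
        self.lif1 = snn.Leaky(beta=beta)  # beta = membrane potential decay per step
        self.fc2 = nn.Linear(num_hidden, num_classes)
        self.lif2 = snn.Leaky(beta=beta)

    def forward(self, x, num_steps=25):
        # Membrane potentials start at rest; each step applies
        # U[t+1] = beta * U[t] + I[t], with a spike and reset once U crosses the threshold.
        mem1 = self.lif1.init_leaky()
        mem2 = self.lif2.init_leaky()
        spike_counts = 0
        for _ in range(num_steps):
            spk1, mem1 = self.lif1(self.fc1(x), mem1)
            spk2, mem2 = self.lif2(self.fc2(spk1), mem2)
            spike_counts = spike_counts + spk2  # accumulate output spikes per class
        return spike_counts  # rate code: the class with the most spikes wins


# Usage with a batch of four flattened grayscale images (shapes are placeholders).
model = SpikingVehicleClassifier()
out = model(torch.rand(4, 64 * 64))
pred = out.argmax(dim=1)
```

In practice such a network is trained with a surrogate gradient (e.g. `snntorch.surrogate.fast_sigmoid`) so that backpropagation can pass through the non-differentiable spike function; this is the standard snnTorch recipe rather than a detail taken from this abstract.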
Related papers
- STEMS: Spatial-Temporal Mapping Tool For Spiking Neural Networks [5.144074723846297]
Spiking Neural Networks (SNNs) are promising bio-inspired third-generation neural networks. Recent research has trained deep SNN models with accuracy on par with Artificial Neural Networks (ANNs).
arXiv Detail & Related papers (2025-02-05T15:44:15Z)
- When Spiking neural networks meet temporal attention image decoding and adaptive spiking neuron [7.478056407323783]
Spiking Neural Networks (SNNs) are capable of encoding and processing temporal information in a biologically plausible way.
We propose a novel method for image decoding based on temporal attention (TAID) and an adaptive Leaky-Integrate-and-Fire neuron model.
arXiv Detail & Related papers (2024-06-05T08:21:55Z)
- Temporal Spiking Neural Networks with Synaptic Delay for Graph Reasoning [91.29876772547348]
Spiking neural networks (SNNs) are investigated as biologically inspired models of neural computation.
This paper reveals that SNNs, when amalgamated with synaptic delay and temporal coding, are proficient in executing (knowledge) graph reasoning.
arXiv Detail & Related papers (2024-05-27T05:53:30Z)
- Efficient and Effective Time-Series Forecasting with Spiking Neural Networks [47.371024581669516]
Spiking neural networks (SNNs) provide a unique pathway for capturing the intricacies of temporal data.
Applying SNNs to time-series forecasting is challenging due to difficulties in effective temporal alignment, complexities in encoding processes, and the absence of standardized guidelines for model selection.
We propose a framework for SNNs in time-series forecasting tasks, leveraging the efficiency of spiking neurons in processing temporal information.
arXiv Detail & Related papers (2024-02-02T16:23:50Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Adaptive-SpikeNet: Event-based Optical Flow Estimation using Spiking Neural Networks with Learnable Neuronal Dynamics [6.309365332210523]
Spiking Neural Networks (SNNs) with their neuro-inspired event-driven processing can efficiently handle asynchronous data.
We propose an adaptive fully-spiking framework with learnable neuronal dynamics to alleviate the spike vanishing problem.
Our experiments show an average reduction of 13% in average endpoint error (AEE) compared to state-of-the-art ANNs.
arXiv Detail & Related papers (2022-09-21T21:17:56Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z)
- Spiking Neural Networks for Visual Place Recognition via Weighted Neuronal Assignments [24.754429120321365]
Spiking neural networks (SNNs) offer compelling potential advantages, including energy efficiency and low latency.
One promising area for high-performance SNNs is template matching and image recognition.
This research introduces the first high-performance SNN for the Visual Place Recognition (VPR) task.
arXiv Detail & Related papers (2021-09-14T05:40:40Z)
- BackEISNN: A Deep Spiking Neural Network with Adaptive Self-Feedback and Balanced Excitatory-Inhibitory Neurons [8.956708722109415]
Spiking neural networks (SNNs) transmit information through discrete spikes, which makes them well suited to processing spatial-temporal information.
We propose a deep spiking neural network with adaptive self-feedback and balanced excitatory and inhibitory neurons (BackEISNN).
For the MNIST, FashionMNIST, and N-MNIST datasets, our model has achieved state-of-the-art performance.
arXiv Detail & Related papers (2021-05-27T08:38:31Z)
- Neuroevolution of a Recurrent Neural Network for Spatial and Working Memory in a Simulated Robotic Environment [57.91534223695695]
We evolved weights in a biologically plausible recurrent neural network (RNN) using an evolutionary algorithm to replicate the behavior and neural activity observed in rats.
Our method demonstrates how the dynamic activity in evolved RNNs can capture interesting and complex cognitive behavior.
arXiv Detail & Related papers (2021-02-25T02:13:52Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)