Event-based Neural Spike Detection Using Spiking Neural Networks for Neuromorphic iBMI Systems
- URL: http://arxiv.org/abs/2505.06544v1
- Date: Sat, 10 May 2025 07:07:00 GMT
- Title: Event-based Neural Spike Detection Using Spiking Neural Networks for Neuromorphic iBMI Systems
- Authors: Chanwook Hwang, Biyan Zhou, Ye Ke, Vivek Mohan, Jong Hwan Ko, Arindam Basu
- Abstract summary: Implantable brain-machine interfaces (iBMIs) are evolving to record from thousands of neurons wirelessly but face challenges in data bandwidth, power consumption, and implant size. We propose a novel Spiking Neural Network Spike Detector (SNN-SPD) that processes event-based neural data generated via delta modulation and pulse count modulation, converting signals into sparse events.
- Score: 6.5271882730600455
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Implantable brain-machine interfaces (iBMIs) are evolving to record from thousands of neurons wirelessly but face challenges in data bandwidth, power consumption, and implant size. We propose a novel Spiking Neural Network Spike Detector (SNN-SPD) that processes event-based neural data generated via delta modulation and pulse count modulation, converting signals into sparse events. By leveraging the temporal dynamics and inherent sparsity of spiking neural networks, our method improves spike detection performance while maintaining low computational overhead suitable for implantable devices. Our experimental results demonstrate that the proposed SNN-SPD achieves an accuracy of 95.72% at high noise levels (standard deviation 0.2), which is about 2% higher than the existing Artificial Neural Network Spike Detector (ANN-SPD). Moreover, SNN-SPD requires only 0.41% of the computation and about 26.62% of the weight parameters compared to ANN-SPD, with zero multiplications. This approach balances efficiency and performance, enabling effective data compression and power savings for next-generation iBMIs.
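To make the event-based front end concrete, the sketch below pairs a minimal delta modulator (emitting UP/DOWN events whenever the input drifts by a fixed threshold from its last reference level) with a toy single-neuron leaky integrate-and-fire detector driven by those events. All thresholds, weights, and time constants here are illustrative assumptions, not the parameters of the SNN-SPD described in the abstract.

```python
import numpy as np

def delta_modulate(signal, threshold=0.05):
    """Convert a sampled waveform into sparse UP/DOWN events.

    Returns an array of (sample_index, polarity) pairs: polarity is +1 when
    the signal rose by `threshold` since the last reference level and -1 when
    it fell by `threshold`. The threshold is an illustrative value.
    """
    events = []
    ref = signal[0]
    for t, x in enumerate(signal):
        while x - ref >= threshold:
            events.append((t, +1))
            ref += threshold
        while ref - x >= threshold:
            events.append((t, -1))
            ref -= threshold
    return np.array(events, dtype=int)

def lif_detect(events, n_samples, tau=20.0, v_th=1.0, w_up=0.6, w_dn=0.6):
    """Toy single-neuron LIF detector driven by the event stream.

    The membrane potential leaks with time constant `tau` (in samples),
    integrates incoming events, and flags a detection whenever it crosses
    `v_th`. All constants are assumptions.
    """
    ev_by_t = {}
    for t, p in events:
        ev_by_t.setdefault(int(t), []).append(int(p))
    v = 0.0
    decay = np.exp(-1.0 / tau)
    detections = np.zeros(n_samples, dtype=bool)
    for t in range(n_samples):
        v *= decay  # leak; fixed-point hardware would typically use a bit shift here
        for p in ev_by_t.get(t, []):
            v += w_up if p > 0 else w_dn
        if v >= v_th:
            detections[t] = True  # putative extracellular spike
            v = 0.0               # reset after firing
    return detections

# Usage: a noisy trace with one synthetic spike-shaped bump around sample 500.
rng = np.random.default_rng(0)
trace = 0.01 * rng.standard_normal(1000)
trace[500:510] += np.hanning(10)
ev = delta_modulate(trace)
det = lif_detect(ev, len(trace))
print(f"{len(ev)} events, detections at samples {np.flatnonzero(det)}")
```

The leak is written as a floating-point multiply for readability; an implantable fixed-point implementation would normally replace it with a shift, in line with the zero-multiplication goal stated in the abstract.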
Related papers
- Efficient Memristive Spiking Neural Networks Architecture with Supervised In-Situ STDP Method [0.0]
Memristor-based Spiking Neural Networks (SNNs) with temporal spike encoding enable ultra-low-energy computation. This paper presents a circuit-level memristive spiking neural network (SNN) architecture trained using a proposed novel supervised in-situ learning algorithm.
arXiv Detail & Related papers (2025-07-28T17:09:48Z) - Neuromorphic Wireless Split Computing with Resonate-and-Fire Neurons [69.73249913506042]
This paper investigates a wireless split computing architecture that employs resonate-and-fire (RF) neurons to process time-domain signals directly. By resonating at tunable frequencies, RF neurons extract time-localized spectral features while maintaining low spiking activity. Experimental results show that the proposed RF-SNN architecture achieves comparable accuracy to conventional LIF-SNNs and ANNs (a toy sketch of resonate-and-fire dynamics appears after this list).
arXiv Detail & Related papers (2025-06-24T21:14:59Z) - Delay Neural Networks (DeNN) for exploiting temporal information in event-based datasets [49.1574468325115]
Delay Neural Networks (DeNN) are designed to explicitly use exact continuous temporal information of spikes in both forward and backward passes. Good performance is obtained, especially for datasets where temporal information is important.
arXiv Detail & Related papers (2025-01-10T14:58:15Z) - Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval. A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed. The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z) - Neuromorphic Wireless Split Computing with Multi-Level Spikes [69.73249913506042]
Neuromorphic computing uses spiking neural networks (SNNs) to perform inference tasks. Embedding a small payload within each spike exchanged between spiking neurons can enhance inference accuracy without increasing energy consumption. Split computing, where an SNN is partitioned across two devices, is a promising solution. This paper presents the first comprehensive study of a neuromorphic wireless split computing architecture that employs multi-level SNNs.
arXiv Detail & Related papers (2024-11-07T14:08:35Z) - Decoding finger velocity from cortical spike trains with recurrent spiking neural networks [6.404492073110551]
Invasive brain-machine interfaces (BMIs) can significantly improve the quality of life of motor-impaired patients.
BMIs must meet strict latency and energy constraints while providing reliable decoding performance.
We trained RSNNs to decode finger velocity from cortical spike trains of two macaque monkeys.
arXiv Detail & Related papers (2024-09-03T10:15:33Z) - Canonic Signed Spike Coding for Efficient Spiking Neural Networks [7.524721345903027]
Spiking Neural Networks (SNNs) seek to mimic the spiking behavior of biological neurons and are expected to play a key role in the advancement of neural computing and artificial intelligence. The conversion of Artificial Neural Networks (ANNs) to SNNs is the most widely used training method, which ensures that the resulting SNNs perform comparably to ANNs on large-scale datasets. Current schemes typically use spike count or timing for encoding, which is linearly related to ANN activations and increases the required number of time steps. We propose a novel Canonic Signed Spike (CSS) coding scheme.
arXiv Detail & Related papers (2024-08-30T12:39:25Z) - Automotive Object Detection via Learning Sparse Events by Spiking Neurons [20.930277906912394]
Spiking Neural Networks (SNNs) provide a temporal representation that is inherently aligned with event-based data.
We present a specialized spiking feature pyramid network (SpikeFPN) optimized for automotive event-based object detection.
arXiv Detail & Related papers (2023-07-24T15:47:21Z) - Adaptive-SpikeNet: Event-based Optical Flow Estimation using Spiking Neural Networks with Learnable Neuronal Dynamics [6.309365332210523]
Spiking Neural Networks (SNNs) with their neuro-inspired event-driven processing can efficiently handle asynchronous data.
We propose an adaptive fully-spiking framework with learnable neuronal dynamics to alleviate the spike vanishing problem.
Our experiments on event-based optical flow datasets show an average reduction of 13% in average endpoint error (AEE) compared to state-of-the-art ANNs.
arXiv Detail & Related papers (2022-09-21T21:17:56Z) - Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Voltage-Dependent Synaptic Plasticity (VDSP): Unsupervised probabilistic Hebbian plasticity rule based on neurons membrane potential [5.316910132506153]
We propose a brain-inspired unsupervised local learning rule for the online implementation of Hebb's plasticity mechanism on neuromorphic hardware.
The proposed VDSP learning rule updates the synaptic conductance on the spike of the postsynaptic neuron only.
We report 85.01 $\pm$ 0.76% (Mean $\pm$ S.D.) accuracy for a network of 100 output neurons on the MNIST dataset.
arXiv Detail & Related papers (2022-03-21T14:39:02Z) - Hardware Implementation of Spiking Neural Networks Using Time-To-First-Spike Encoding [5.709318189772638]
Hardware-based spiking neural networks (SNNs) are regarded as promising candidates for cognitive computing systems.
In this work, we train the SNN in which the firing time carries information using temporal backpropagation (a generic time-to-first-spike encoding sketch appears after this list).
The temporally encoded SNN with 512 hidden neurons showed an accuracy of 96.90% for the MNIST test set.
arXiv Detail & Related papers (2020-06-09T03:31:15Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
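For the resonate-and-fire split-computing entry above, here is a minimal discrete-time sketch of an Izhikevich-style RF neuron: a complex state that decays and rotates at a tunable frequency and spikes when its imaginary part crosses a threshold. The frequency, damping, threshold, and reset rule are illustrative assumptions, not parameters taken from the cited architecture.

```python
import numpy as np

def resonate_and_fire(inputs, omega=2 * np.pi * 10.0, b=-1.0, dt=1e-3, v_th=0.8):
    """Discrete-time resonate-and-fire neuron (Izhikevich-style sketch).

    `inputs` is a real-valued drive sampled every `dt` seconds. The complex
    state z decays at rate `b` (< 0) and rotates at angular frequency `omega`,
    so the neuron responds selectively to input energy near omega/(2*pi) Hz.
    A spike is recorded whenever Im(z) crosses `v_th`; all constants here are
    assumptions for illustration.
    """
    z = 0.0 + 0.0j
    kernel = np.exp((b + 1j * omega) * dt)   # per-step decay plus rotation
    spikes = np.zeros(len(inputs), dtype=bool)
    for t, i_t in enumerate(inputs):
        z = z * kernel + i_t * dt
        if z.imag >= v_th:
            spikes[t] = True
            z = 0.0 + 0.0j                   # simple reset after firing
    return spikes

# Usage: the neuron (tuned to 10 Hz) fires for a 10 Hz drive but stays
# mostly silent for an off-resonance 40 Hz drive of the same amplitude.
t = np.arange(0.0, 2.0, 1e-3)
on_res = resonate_and_fire(50.0 * np.sin(2 * np.pi * 10.0 * t))
off_res = resonate_and_fire(50.0 * np.sin(2 * np.pi * 40.0 * t))
print(on_res.sum(), "spikes on-resonance vs", off_res.sum(), "off-resonance")
```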
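For the time-to-first-spike hardware entry above, the following is a generic TTFS encoder rather than the cited chip's scheme: each input intensity in [0, 1] is mapped to at most one spike whose latency shrinks as intensity grows, so stronger inputs fire earlier within a fixed coding window. The window length and the linear latency mapping are assumptions.

```python
import numpy as np

def ttfs_encode(x, t_window=100):
    """Time-to-first-spike encoding of intensities x in [0, 1].

    Each input produces at most one spike with latency round((1 - x) * t_window),
    so x = 1 fires at t = 0 and x = 0 never fires (a latency equal to t_window
    is treated as 'no spike'). Returns a (t_window, n_inputs) boolean raster.
    """
    x = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)
    latencies = np.rint((1.0 - x) * t_window).astype(int)
    raster = np.zeros((t_window, x.size), dtype=bool)
    fired = latencies < t_window              # x == 0 stays silent
    raster[latencies[fired], np.flatnonzero(fired)] = True
    return raster

# Usage: brighter pixels spike earlier; -1 marks inputs that never fire.
pixels = np.array([0.0, 0.25, 0.5, 1.0])
raster = ttfs_encode(pixels)
first_spike = np.where(raster.any(axis=0), raster.argmax(axis=0), -1)
print(first_spike)   # -> [-1 75 50  0]
```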