Adaptively Pruned Spiking Neural Networks for Energy-Efficient Intracortical Neural Decoding
- URL: http://arxiv.org/abs/2504.11568v1
- Date: Tue, 15 Apr 2025 19:16:34 GMT
- Title: Adaptively Pruned Spiking Neural Networks for Energy-Efficient Intracortical Neural Decoding
- Authors: Francesca Rivelli, Martin Popov, Charalampos S. Kouzinopoulos, Guangzhi Tang
- Abstract summary: Spiking Neural Networks (SNNs) on neuromorphic hardware have demonstrated remarkable efficiency in neural decoding. We introduce a novel adaptive pruning algorithm specifically designed for SNNs with high activation sparsity, targeting intracortical neural decoding.
- Score: 0.06181089784338582
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Intracortical brain-machine interfaces demand low-latency, energy-efficient solutions for neural decoding. Spiking Neural Networks (SNNs) deployed on neuromorphic hardware have demonstrated remarkable efficiency in neural decoding by leveraging sparse binary activations and efficient spatiotemporal processing. However, reducing the computational cost of SNNs remains a critical challenge for developing ultra-efficient intracortical neural implants. In this work, we introduce a novel adaptive pruning algorithm specifically designed for SNNs with high activation sparsity, targeting intracortical neural decoding. Our method dynamically adjusts pruning decisions and employs a rollback mechanism to selectively eliminate redundant synaptic connections without compromising decoding accuracy. Experimental evaluation on the NeuroBench Non-Human Primate (NHP) Motor Prediction benchmark shows that our pruned network achieves performance comparable to dense networks, with up to a tenfold improvement in efficiency. Moreover, hardware simulation on a neuromorphic processor reveals that the pruned network operates at sub-$\mu$W power levels, underscoring its potential for energy-constrained neural implants. These results highlight the promise of our approach for advancing energy-efficient intracortical brain-machine interfaces with low-overhead on-device intelligence.
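The abstract describes the pruning procedure only at a high level. The sketch below gives one plausible reading of it in plain NumPy: synapses are removed in small sparsity increments by weight magnitude, and a step is rolled back as soon as a held-out decoding metric degrades beyond a tolerance. All names, the magnitude criterion, the negative-MSE proxy metric, and the toy data shapes are illustrative assumptions; this is not the authors' implementation and it does not reproduce the NeuroBench NHP Motor Prediction metric.

```python
import numpy as np

def evaluate(weights, x_val, y_val):
    """Proxy decoding metric: negative mean-squared error of a linear readout.

    Stand-in only; the paper's NeuroBench motor-prediction metric is not
    reproduced here.
    """
    pred = x_val @ weights
    return -np.mean((pred - y_val) ** 2)

def adaptive_prune_with_rollback(weights, x_val, y_val,
                                 step=0.05, max_sparsity=0.9, tol=1e-3):
    """Prune the smallest-magnitude synapses in increments of `step`,
    rolling back and stopping once the validation metric drops by more
    than `tol` relative to the dense baseline (hypothetical criterion)."""
    mask = np.ones_like(weights, dtype=bool)
    baseline = evaluate(weights, x_val, y_val)
    sparsity = 0.0

    while sparsity + step <= max_sparsity:
        candidate = sparsity + step
        # Magnitude threshold so that roughly `candidate` of all synapses are removed.
        cutoff = np.quantile(np.abs(weights), candidate)
        new_mask = mask & (np.abs(weights) > cutoff)

        score = evaluate(weights * new_mask, x_val, y_val)
        if baseline - score > tol:
            break  # rollback: keep the previously accepted mask
        mask, sparsity = new_mask, candidate

    return weights * mask, mask, sparsity

# Toy usage: surrogate binned spike counts from 96 channels decoding 2D velocity.
rng = np.random.default_rng(0)
W = rng.normal(size=(96, 2))
W[rng.random(W.shape) < 0.7] *= 1e-3      # most synapses contribute almost nothing
X = rng.poisson(1.0, size=(200, 96)).astype(float)
Y = X @ W + 0.1 * rng.normal(size=(200, 2))

pruned_W, mask, reached = adaptive_prune_with_rollback(W, X, Y)
print(f"reached sparsity: {reached:.2f}, remaining synapses: {int(mask.sum())}")
```

In practice the rollback tolerance and sparsity step would be tuned against the actual decoding metric on held-out neural data rather than this MSE proxy.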
Related papers
- Spiking Neural Network for Intra-cortical Brain Signal Decoding [20.79539749730775]
Decoding brain signals accurately and efficiently is crucial for intra-cortical brain-computer interfaces. This paper proposes a spiking neural network (SNN) for effective and energy-efficient intra-cortical brain signal decoding.
arXiv Detail & Related papers (2025-04-12T13:41:59Z)
- Threshold Neuron: A Brain-inspired Artificial Neuron for Efficient On-device Inference [17.95548501630064]
We propose a novel artificial neuron model, Threshold Neurons. We construct neural networks similar to those with traditional artificial neurons, while significantly reducing hardware implementation complexity. Our experiments validate the effectiveness of neural networks utilizing Threshold Neurons, achieving substantial power savings of 7.51x to 8.19x and area savings of 3.89x to 4.33x at the kernel level, with minimal loss in precision.
arXiv Detail & Related papers (2024-12-18T14:42:43Z)
- Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval. A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed. The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z)
- Hybrid Spiking Neural Networks for Low-Power Intra-Cortical Brain-Machine Interfaces [42.72938925647165]
Intra-cortical brain-machine interfaces (iBMIs) have the potential to dramatically improve the lives of people with paraplegia.
Current iBMIs suffer from scalability and mobility limitations due to bulky hardware and wiring.
We are investigating hybrid spiking neural networks for embedded neural decoding in wireless iBMIs.
arXiv Detail & Related papers (2024-09-06T17:48:44Z)
- Energy-efficient Spiking Neural Network Equalization for IM/DD Systems with Optimized Neural Encoding [53.909333359654276]
We propose an energy-efficient equalizer for IM/DD systems based on spiking neural networks.
We optimize a neural spike encoding that boosts the equalizer's performance while decreasing energy consumption.
arXiv Detail & Related papers (2023-12-20T10:45:24Z)
- Recent Advances in Scalable Energy-Efficient and Trustworthy Spiking Neural Networks: from Algorithms to Technology [11.479629320025673]
Spiking neural networks (SNNs) have become an attractive alternative to deep neural networks for a broad range of signal processing applications.
We describe advances in algorithmic and optimization innovations to efficiently train and scale low-latency, energy-efficient SNNs.
We discuss the potential path forward for research in building deployable SNN systems.
arXiv Detail & Related papers (2023-12-02T19:47:00Z)
- SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Effective and Efficient Computation with Multiple-timescale Spiking Recurrent Neural Networks [0.9790524827475205]
We show how a novel type of adaptive spiking recurrent neural network (SRNN) is able to achieve state-of-the-art performance.
We calculate a >100x energy improvement for our SRNNs over classical RNNs on the harder tasks.
arXiv Detail & Related papers (2020-05-24T01:04:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.