Towards Practical Alzheimer's Disease Diagnosis: A Lightweight and Interpretable Spiking Neural Model
- URL: http://arxiv.org/abs/2506.09695v2
- Date: Mon, 07 Jul 2025 08:47:01 GMT
- Title: Towards Practical Alzheimer's Disease Diagnosis: A Lightweight and Interpretable Spiking Neural Model
- Authors: Changwei Wu, Yifei Chen, Yuxin Du, Jinying Zong, Jie Dong, Mingxuan Liu, Yong Peng, Jin Fan, Feiwei Qin, Changmiao Wang
- Abstract summary: Early diagnosis of Alzheimer's Disease (AD) is vital yet hindered by subjective assessments and the high cost of multimodal imaging modalities. As a brain-inspired paradigm, spiking neural networks (SNNs) are inherently well-suited for modeling the sparse, event-driven patterns of neural degeneration in AD. We propose FasterSNN, a hybrid neural architecture that integrates biologically inspired LIF neurons with region-adaptive convolution and multi-scale spiking attention.
- Score: 7.289867430801027
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Early diagnosis of Alzheimer's Disease (AD), especially at the mild cognitive impairment (MCI) stage, is vital yet hindered by subjective assessments and the high cost of multimodal imaging modalities. Although deep learning methods offer automated alternatives, their energy inefficiency and computational demands limit real-world deployment, particularly in resource-constrained settings. As a brain-inspired paradigm, spiking neural networks (SNNs) are inherently well-suited for modeling the sparse, event-driven patterns of neural degeneration in AD, offering a promising foundation for interpretable and low-power medical diagnostics. However, existing SNNs often suffer from weak expressiveness and unstable training, which restrict their effectiveness in complex medical tasks. To address these limitations, we propose FasterSNN, a hybrid neural architecture that integrates biologically inspired LIF neurons with region-adaptive convolution and multi-scale spiking attention. This design enables sparse, efficient processing of 3D MRI while preserving diagnostic accuracy. Experiments on benchmark datasets demonstrate that FasterSNN achieves competitive performance with substantially improved efficiency and stability, supporting its potential for practical AD screening. Our source code is available at https://github.com/wuchangw/FasterSNN.
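FasterSNN's basic building block is the leaky integrate-and-fire (LIF) neuron. For orientation, a minimal discrete-time LIF update is sketched below in NumPy; the function name, time constant, threshold, and hard reset are illustrative assumptions, not the exact neuron formulation or parameters used by FasterSNN.

```python
import numpy as np

def lif_step(v, x, tau=2.0, v_th=1.0, v_reset=0.0):
    """One discrete-time leaky integrate-and-fire (LIF) update (illustrative).

    v: membrane potential carried over from the previous time step
    x: input current at the current time step
    tau: membrane time constant controlling the leak
    """
    # Leaky integration: decay the old potential toward the input.
    v = v + (x - v) / tau
    # Emit a binary spike wherever the potential crosses the threshold.
    spike = (v >= v_th).astype(v.dtype)
    # Hard reset of the neurons that fired.
    v = np.where(spike > 0, v_reset, v)
    return spike, v

# Toy usage: a constant supra-threshold input drives 4 neurons for 10 steps.
v = np.zeros(4)
for t in range(10):
    spike, v = lif_step(v, x=np.full(4, 1.5))
    print(t, spike)
```

Sparsity comes from the binary spike output: downstream layers only need to act on the time steps and units where `spike` is 1, which is what makes SNN inference attractive for low-power deployment.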
Related papers
- Fractional Spike Differential Equations Neural Network with Efficient Adjoint Parameters Training [63.3991315762955]
Spiking Neural Networks (SNNs) draw inspiration from biological neurons to create realistic models for brain-like computation. Most existing SNNs assume a single time constant for neuronal membrane voltage dynamics, modeled by first-order ordinary differential equations (ODEs) with Markovian characteristics. We propose the Fractional SPIKE Differential Equation neural network (fspikeDE), which captures long-term dependencies in membrane voltage and spike trains through fractional-order dynamics.
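For orientation, the contrast described above can be written out explicitly: the standard first-order (Markovian) LIF membrane equation versus a fractional-order generalization with a Caputo derivative, whose power-law memory kernel introduces long-term dependence. This is a generic sketch of the distinction; the exact operator and formulation used by fspikeDE may differ.

```latex
% Standard first-order (Markovian) LIF membrane dynamics:
\tau \frac{dV(t)}{dt} = -\bigl(V(t) - V_{\mathrm{rest}}\bigr) + R\,I(t)

% Fractional-order generalization (Caputo derivative of order 0 < \alpha \le 1),
% which makes the membrane voltage depend on its entire past trajectory:
\tau^{\alpha}\, {}^{C}\!D_{t}^{\alpha} V(t) = -\bigl(V(t) - V_{\mathrm{rest}}\bigr) + R\,I(t)
```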
arXiv Detail & Related papers (2025-07-22T18:20:56Z) - Integrating Complexity and Biological Realism: High-Performance Spiking Neural Networks for Breast Cancer Detection [0.0]
The event-driven nature of Spiking Neural Networks (SNNs) enables efficient encoding of spatial and temporal features. SNNs have seen limited application in medical image recognition due to difficulties in matching the performance of conventional deep learning models. We propose a novel breast cancer classification approach that combines SNNs with Lempel-Ziv Complexity (LZC), a computationally efficient measure of sequence complexity.
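A common way to quantify the complexity of a binarized spike train is LZ76 phrase counting, sketched below. The function name and binarization are illustrative assumptions; the paper's exact LZC variant and how it is combined with the SNN classifier are not specified here.

```python
def lempel_ziv_complexity(sequence):
    """LZ76 complexity: number of distinct phrases in the left-to-right
    parsing of a binary sequence (e.g. a binarized spike train)."""
    s = ''.join(str(int(b)) for b in sequence)
    i, c, n = 0, 0, len(s)
    while i < n:
        # Grow the current phrase until it has not been seen before.
        length = 1
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        c += 1          # close the phrase
        i += length     # continue parsing after it
    return c

# Toy usage on a short spike train.
print(lempel_ziv_complexity([0, 0, 1, 0, 1, 1, 0, 0, 0, 1]))
```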
arXiv Detail & Related papers (2025-06-06T17:47:27Z) - Adaptively Pruned Spiking Neural Networks for Energy-Efficient Intracortical Neural Decoding [0.06181089784338582]
Spiking Neural Networks (SNNs) on neuromorphic hardware have demonstrated remarkable efficiency in neural decoding. We introduce a novel adaptive pruning algorithm specifically designed for SNNs with high activation sparsity, targeting intracortical neural decoding.
arXiv Detail & Related papers (2025-04-15T19:16:34Z) - Spiking Meets Attention: Efficient Remote Sensing Image Super-Resolution with Attention Spiking Neural Networks [57.17129753411926]
Spiking neural networks (SNNs) are emerging as a promising alternative to traditional artificial neural networks (ANNs). We propose SpikeSR, which achieves state-of-the-art performance across various remote sensing benchmarks such as AID, DOTA, and DIOR.
arXiv Detail & Related papers (2025-03-06T09:06:06Z) - Decoding finger velocity from cortical spike trains with recurrent spiking neural networks [6.404492073110551]
Invasive brain-machine interfaces (BMIs) can significantly improve the quality of life of motor-impaired patients.
BMIs must meet strict latency and energy constraints while providing reliable decoding performance.
We trained RSNNs to decode finger velocity from cortical spike trains of two macaque monkeys.
arXiv Detail & Related papers (2024-09-03T10:15:33Z) - Toward End-to-End Bearing Fault Diagnosis for Industrial Scenarios with Spiking Neural Networks [6.686258023516048]
Spiking neural networks (SNNs) transmit information via low-power binary spikes.
We propose a Multi-scale Residual Attention SNN (MRA-SNN) to improve efficiency, performance, and robustness of SNN methods.
MRA-SNN significantly outperforms existing methods in terms of accuracy, energy consumption, and noise robustness, and is more feasible for deployment in real-world industrial scenarios.
arXiv Detail & Related papers (2024-08-17T06:41:58Z) - SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z) - Long Short-term Memory with Two-Compartment Spiking Neuron [64.02161577259426]
We propose a novel biologically inspired Long Short-Term Memory Leaky Integrate-and-Fire spiking neuron model, dubbed LSTM-LIF.
Our experimental results, on a diverse range of temporal classification tasks, demonstrate superior temporal classification capability, rapid training convergence, strong network generalizability, and high energy efficiency of the proposed LSTM-LIF model.
This work, therefore, opens up a myriad of opportunities for resolving challenging temporal processing tasks on emerging neuromorphic computing machines.
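As a rough, generic illustration of two-compartment spiking dynamics (not the LSTM-LIF formulation itself), the sketch below couples a slow dendritic leaky integrator to a faster spiking somatic compartment; the time constants, threshold, and hard reset are assumed values.

```python
import numpy as np

def two_compartment_step(v_d, v_s, x, tau_d=4.0, tau_s=2.0, v_th=1.0):
    """Generic two-compartment leaky integrator step (illustrative only)."""
    v_d = v_d + (x - v_d) / tau_d        # slow dendritic compartment integrates input
    v_s = v_s + (v_d - v_s) / tau_s      # soma is driven by the dendrite
    spike = (v_s >= v_th).astype(v_s.dtype)
    v_s = np.where(spike > 0, 0.0, v_s)  # hard reset after firing
    return spike, v_d, v_s

# Toy usage: 3 neurons driven by a constant input for 8 steps.
v_d, v_s = np.zeros(3), np.zeros(3)
for t in range(8):
    spike, v_d, v_s = two_compartment_step(v_d, v_s, x=np.full(3, 1.2))
```

The slower dendritic compartment acts as a longer-lived memory trace than a single LIF state, which is the intuition behind using two compartments for temporal tasks.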
arXiv Detail & Related papers (2023-07-14T08:51:03Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - Efficient Uncertainty Estimation in Spiking Neural Networks via MC-dropout [3.692069129522824]
Spiking neural networks (SNNs) have gained attention as models of sparse and event-driven communication of biological neurons.
We propose an efficient Monte Carlo (MC) dropout-based approach for uncertainty estimation in SNNs.
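Standard MC-dropout keeps the dropout masks stochastic at inference time and averages several forward passes to obtain a predictive mean and variance; a minimal PyTorch sketch is below. The toy classifier, sample count, and softmax output are assumptions, and the paper's SNN-specific formulation may differ.

```python
import torch
import torch.nn as nn

def mc_dropout_predict(model, x, n_samples=20):
    """Monte Carlo dropout: run several stochastic forward passes and
    return the predictive mean and variance of the class probabilities."""
    model.eval()
    # Re-enable only the dropout layers so their masks stay random at inference.
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.train()
    with torch.no_grad():
        preds = torch.stack([model(x).softmax(dim=-1) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.var(dim=0)

# Toy usage with a small classifier that contains a dropout layer.
net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Dropout(0.3), nn.Linear(32, 3))
mean, var = mc_dropout_predict(net, torch.randn(8, 16))
```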
arXiv Detail & Related papers (2023-04-20T10:05:57Z) - Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
However, SNNs are difficult to train efficiently because spike generation is non-differentiable.
We propose the Differentiation on Spike Representation (DSR) method, which enables training high-performance, low-latency SNNs despite this obstacle.
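That non-differentiability stems from the hard spike threshold. DSR sidesteps it by differentiating through a spike representation; as a point of contrast, the sketch below shows the more common surrogate-gradient workaround (a generic technique, not the DSR method), using an assumed sigmoid-derivative surrogate.

```python
import torch

V_TH = 1.0   # firing threshold (assumed)
BETA = 5.0   # surrogate sharpness (assumed)

class SpikeFunction(torch.autograd.Function):
    """Heaviside spike with a sigmoid-derivative surrogate gradient."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        # Forward pass: hard threshold produces binary spikes.
        return (v >= V_TH).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Backward pass: replace the Heaviside step's Dirac-delta derivative
        # with a smooth sigmoid-derivative surrogate.
        sig = torch.sigmoid(BETA * (v - V_TH))
        return grad_output * BETA * sig * (1.0 - sig)

# Toy usage: gradients now flow through the spiking nonlinearity.
v = torch.randn(4, requires_grad=True)
SpikeFunction.apply(v).sum().backward()
print(v.grad)
```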
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.