Reliable Brain Tumor Segmentation Based on Spiking Neural Networks with Efficient Training
- URL: http://arxiv.org/abs/2601.16652v1
- Date: Fri, 23 Jan 2026 11:16:34 GMT
- Title: Reliable Brain Tumor Segmentation Based on Spiking Neural Networks with Efficient Training
- Authors: Aurora Pia Ghiardelli, Guangzhi Tang, Tao Sun
- Abstract summary: We propose a reliable and energy-efficient framework for 3D brain tumor segmentation using spiking neural networks (SNNs). A multi-view ensemble of sagittal, coronal, and axial SNN models provides voxel-wise uncertainty estimation and enhances segmentation robustness.
- Score: 3.855503898515455
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a reliable and energy-efficient framework for 3D brain tumor segmentation using spiking neural networks (SNNs). A multi-view ensemble of sagittal, coronal, and axial SNN models provides voxel-wise uncertainty estimation and enhances segmentation robustness. To address the high computational cost in training SNN models for semantic image segmentation, we employ Forward Propagation Through Time (FPTT), which maintains temporal learning efficiency with significantly reduced computational cost. Experiments on the Multimodal Brain Tumor Segmentation Challenges (BraTS 2017 and BraTS 2023) demonstrate competitive accuracy, well-calibrated uncertainty, and an 87% reduction in FLOPs, underscoring the potential of SNNs for reliable, low-power medical IoT and Point-of-Care systems.
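The paper does not publish its ensemble code here, but the multi-view idea can be sketched as follows (a minimal illustration with assumed array shapes and function names, not the authors' implementation): per-view class probabilities are averaged, and the entropy of the averaged distribution serves as a voxel-wise uncertainty map.

```python
import numpy as np

def ensemble_uncertainty(view_probs):
    """Fuse per-view class probabilities and compute voxel-wise
    predictive entropy. `view_probs` has shape (V, C, D, H, W):
    V views (e.g. sagittal/coronal/axial), C classes, D*H*W voxels."""
    mean_probs = view_probs.mean(axis=0)                 # (C, D, H, W)
    eps = 1e-12                                          # avoid log(0)
    entropy = -(mean_probs * np.log(mean_probs + eps)).sum(axis=0)
    prediction = mean_probs.argmax(axis=0)               # hard labels
    return prediction, entropy

# Toy example: 3 views, 2 classes, a 1x1x2 "volume".
probs = np.array([
    [[[[0.9, 0.5]]], [[[0.1, 0.5]]]],   # view 1
    [[[[0.8, 0.4]]], [[[0.2, 0.6]]]],   # view 2
    [[[[0.7, 0.6]]], [[[0.3, 0.4]]]],   # view 3
])
pred, unc = ensemble_uncertainty(probs)
# The second voxel, where the views disagree, gets higher entropy.
```

Voxels on which the three anatomical views disagree receive high entropy, which is one plausible way to obtain the "well-calibrated uncertainty" the abstract refers to.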
Related papers
- General Self-Prediction Enhancement for Spiking Neurons [71.01912385372577]
Spiking Neural Networks (SNNs) are highly energy-efficient due to event-driven, sparse computation, but their training is challenged by spike non-differentiability and trade-offs among performance, efficiency, and biological plausibility. We propose a self-prediction-enhanced spiking neuron method that generates an internal prediction current from its input-output history to modulate the membrane potential. This design offers dual advantages: it creates a continuous gradient path that alleviates vanishing gradients and boosts training stability and accuracy, while also aligning with biological principles, resembling distal dendritic modulation and error-driven synaptic plasticity.
arXiv Detail & Related papers (2026-01-29T15:08:48Z) - Towards Practical Alzheimer's Disease Diagnosis: A Lightweight and Interpretable Spiking Neural Model [7.289867430801027]
Early diagnosis of Alzheimer's Disease (AD) is vital yet hindered by subjective assessments and the high cost of multimodal imaging. As a brain-inspired paradigm, spiking neural networks (SNNs) are inherently well-suited for modeling the sparse, event-driven patterns of neural degeneration in AD. We propose FasterSNN, a hybrid neural architecture that integrates biologically inspired LIF neurons with region-adaptive convolution and multi-scale spiking attention.
arXiv Detail & Related papers (2025-06-11T13:10:49Z) - Integrating Complexity and Biological Realism: High-Performance Spiking Neural Networks for Breast Cancer Detection [0.0]
The event-driven nature of Spiking Neural Networks (SNNs) enables efficient encoding of spatial and temporal features. SNNs have nonetheless seen limited application in medical image recognition due to difficulties in matching the performance of conventional deep learning models. We propose a novel breast cancer classification approach that combines SNNs with Lempel-Ziv Complexity (LZC), a computationally efficient measure of sequence complexity.
arXiv Detail & Related papers (2025-06-06T17:47:27Z) - Adaptively Pruned Spiking Neural Networks for Energy-Efficient Intracortical Neural Decoding [0.06181089784338582]
Spiking Neural Networks (SNNs) on neuromorphic hardware have demonstrated remarkable efficiency in neural decoding. We introduce a novel adaptive pruning algorithm specifically designed for SNNs with high activation sparsity, targeting intracortical neural decoding.
arXiv Detail & Related papers (2025-04-15T19:16:34Z) - Multiplication-Free Parallelizable Spiking Neurons with Efficient Spatio-Temporal Dynamics [40.43988645674521]
Spiking Neural Networks (SNNs) are distinguished from Artificial Neural Networks (ANNs) by their complex neuronal dynamics and sparse binary activations (spikes) inspired by the biological neural system. Traditional neuron models use iterative step-by-step dynamics, resulting in serial computation and slow training of SNNs. Recently, parallelizable spiking neuron models have been proposed to fully utilize the massive parallel computing ability of graphics processing units to accelerate SNN training.
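The serial bottleneck this abstract refers to can be seen in a minimal leaky integrate-and-fire (LIF) loop (an illustrative sketch; the decay constant, threshold, and reset rule are textbook choices, not this paper's model): the membrane potential at step t depends on step t-1, so the time dimension cannot be trivially parallelized.

```python
import numpy as np

def lif_forward(inputs, beta=0.9, v_th=1.0):
    """Minimal LIF dynamics. `inputs` has shape (T, N): T time steps,
    N neurons. Each step reads the previous membrane potential, which
    forces serial computation over the time axis."""
    v = np.zeros(inputs.shape[1])
    spikes = []
    for x_t in inputs:                    # serial dependency across t
        v = beta * v + x_t                # leaky integration
        s = (v >= v_th).astype(float)     # binary spike
        v = v * (1.0 - s)                 # hard reset after a spike
        spikes.append(s)
    return np.stack(spikes)               # (T, N) spike train

spk = lif_forward(np.full((5, 3), 0.4))   # constant sub-threshold drive
```

Parallelizable neuron models reformulate this recurrence so that all T steps can be computed at once on a GPU.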
arXiv Detail & Related papers (2025-01-24T13:44:08Z) - Flexible and Scalable Deep Dendritic Spiking Neural Networks with Multiple Nonlinear Branching [39.664692909673086]
We propose the dendritic spiking neuron (DendSN), incorporating multiple dendritic branches with nonlinear dynamics. Compared to point spiking neurons, DendSN exhibits significantly higher expressivity. Our work demonstrates the possibility of training bio-plausible dendritic SNNs with depths and scales comparable to traditional point-neuron SNNs.
arXiv Detail & Related papers (2024-12-09T10:15:46Z) - Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval. A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed. The converted SNNs achieve almost five-fold power efficiency at a moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z) - BKDSNN: Enhancing the Performance of Learning-based Spiking Neural Networks Training with Blurred Knowledge Distillation [20.34272550256856]
Spiking neural networks (SNNs) mimic biological neural system to convey information via discrete spikes.
Our work achieves state-of-the-art performance for training SNNs on both static and neuromorphic datasets.
arXiv Detail & Related papers (2024-07-12T08:17:24Z) - Applying Dimensionality Reduction as Precursor to LSTM-CNN Models for Classifying Imagery and Motor Signals in ECoG-Based BCIs [0.0]
This research aims to elevate the field by optimizing motor imagery classification algorithms within Brain-Computer Interfaces (BCIs).
We utilize unsupervised techniques for dimensionality reduction, namely Uniform Manifold Approximation and Projection (UMAP) and K-Nearest Neighbors (KNN).
We also evaluate the necessity of employing supervised methods such as Long Short-Term Memory (LSTM) and Convolutional Neural Networks (CNNs) for classification tasks.
arXiv Detail & Related papers (2023-11-22T16:34:06Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - Comparative Analysis of Interval Reachability for Robust Implicit and Feedforward Neural Networks [64.23331120621118]
We use interval reachability analysis to obtain robustness guarantees for implicit neural networks (INNs).
INNs are a class of implicit learning models that use implicit equations as layers.
We show that our approach performs at least as well as, and generally better than, applying state-of-the-art interval bound propagation methods to INNs.
arXiv Detail & Related papers (2022-04-01T03:31:27Z) - Pruning of Deep Spiking Neural Networks through Gradient Rewiring [41.64961999525415]
Spiking Neural Networks (SNNs) have attracted great attention due to their biological plausibility and high energy efficiency on neuromorphic chips.
Most existing methods directly apply pruning approaches in artificial neural networks (ANNs) to SNNs, which ignore the difference between ANNs and SNNs.
We propose Gradient Rewiring (Grad R), a joint learning algorithm of connectivity and weights for SNNs, which enables us to seamlessly optimize network structure without retraining.
arXiv Detail & Related papers (2021-05-11T10:05:53Z) - Stochastic Markov Gradient Descent and Training Low-Bit Neural Networks [77.34726150561087]
We introduce Stochastic Markov Gradient Descent (SMGD), a discrete optimization method applicable to training quantized neural networks.
We provide theoretical guarantees of algorithm performance as well as encouraging numerical results.
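One standard ingredient in low-bit training of the kind this snippet describes is stochastic rounding, which keeps weights on a quantized grid while remaining unbiased in expectation. The sketch below is a generic illustration of that setting (the grid step and function names are assumptions, not the paper's exact SMGD update).

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_round(w, step=0.25):
    """Round each weight to one of its two neighboring grid points,
    with probability proportional to proximity, so the rounding is
    unbiased: E[stochastic_round(w)] == w."""
    lo = np.floor(w / step) * step   # lower grid point
    p_up = (w - lo) / step           # probability of rounding up
    up = rng.random(w.shape) < p_up
    return lo + up * step

w = np.array([0.30, -0.10, 0.55])
q = stochastic_round(w)              # each entry lands on the 0.25 grid
```

Because the rounding is unbiased, averaging many rounded copies recovers the full-precision weights, which is what makes such updates compatible with gradient-based training.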
arXiv Detail & Related papers (2020-08-25T15:48:15Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.