Training Beyond Convergence: Grokking nnU-Net for Glioma Segmentation in Sub-Saharan MRI
- URL: http://arxiv.org/abs/2601.22637v1
- Date: Fri, 30 Jan 2026 06:54:49 GMT
- Title: Training Beyond Convergence: Grokking nnU-Net for Glioma Segmentation in Sub-Saharan MRI
- Authors: Mohtady Barakat, Omar Salah, Ahmed Yasser, Mostafa Ahmed, Zahirul Arief, Waleed Khan, Dong Zhang, Aondona Iorumbur, Confidence Raymond, Mohannad Barakat, Noha Magdy,
- Abstract summary: Gliomas are placing an increasing clinical burden on Sub-Saharan Africa (SSA). In the region, the median survival for patients remains under two years, and access to diagnostic imaging is extremely limited. We utilize the Brain Tumor Segmentation (BraTS) Africa 2025 Challenge dataset, an expert-annotated collection of glioma MRIs.
- Score: 4.495887610902666
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Gliomas are placing an increasing clinical burden on Sub-Saharan Africa (SSA). In the region, the median survival for patients remains under two years, and access to diagnostic imaging is extremely limited. These constraints highlight an urgent need for automated tools that can extract the maximum possible information from each available scan, tools that are specifically trained on local data rather than adapted from high-income settings where conditions are vastly different. We utilize the Brain Tumor Segmentation (BraTS) Africa 2025 Challenge dataset, an expert-annotated collection of glioma MRIs. Our objectives are: (i) to establish a strong baseline with nnU-Net on this dataset, and (ii) to explore whether the celebrated "grokking" phenomenon, an abrupt, late-training jump from memorization to superior generalization, can be triggered to push performance without extra labels. We evaluate two training regimes. The first is a fast, budget-conscious approach that limits optimization to just a few epochs, reflecting the constrained GPU resources typically available in African institutions. Despite this limitation, nnU-Net achieves strong Dice scores: 92.3% for whole tumor (WT), 86.6% for tumor core (TC), and 86.3% for enhancing tumor (ET). The second regime extends training well beyond the point of convergence, aiming to trigger a grokking-driven performance leap. With this approach, we were able to trigger grokking and improve our results, particularly on the harder sub-regions: 92.2% for whole tumor (WT), 90.1% for tumor core (TC), and 90.2% for enhancing tumor (ET).
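The metrics above are region-wise Dice scores over the three composite BraTS regions (WT, TC, ET). As a minimal illustration, not the challenge's official evaluation code, the sketch below computes these scores from flattened label maps. The label convention assumed here (1 = necrotic tumor core, 2 = peritumoral edema, 3 = enhancing tumor) follows recent BraTS releases and should be verified against the actual BraTS-Africa label map.

```python
# Composite regions assumed from recent BraTS conventions:
# WT = whole tumor (all tumor labels), TC = tumor core, ET = enhancing tumor.
REGIONS = {"WT": {1, 2, 3}, "TC": {1, 3}, "ET": {3}}

def dice(pred, truth, labels):
    """Dice = 2|P∩T| / (|P| + |T|), where P and T are the voxel sets
    whose label falls in `labels`. Returns 1.0 when both sets are empty."""
    p = [v in labels for v in pred]
    t = [v in labels for v in truth]
    inter = sum(a and b for a, b in zip(p, t))
    denom = sum(p) + sum(t)
    return 1.0 if denom == 0 else 2.0 * inter / denom

def region_dice(pred, truth):
    """Per-region Dice scores for flattened prediction/ground-truth maps."""
    return {name: dice(pred, truth, labs) for name, labs in REGIONS.items()}
```

For example, a prediction that gets the core and edema voxels right but misplaces the single enhancing voxel scores well on WT, moderately on TC, and zero on ET, mirroring why ET and TC are typically the harder targets.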
Related papers
- Scaling Tumor Segmentation: Best Lessons from Real and Synthetic Data [62.63749675817477]
AbdomenAtlas 2.0 is a dataset of 10,135 CT scans with a total of 15,130 tumor instances manually annotated per voxel in six organs. It achieves notable improvements over public datasets, with a +7% gain on DSC tests and +16% on out-of-distribution tests.
arXiv Detail & Related papers (2025-10-16T16:08:09Z) - Scaling Artificial Intelligence for Multi-Tumor Early Detection with More Reports, Fewer Masks [59.37427210144734]
We introduce R-Super, which trains AI to segment tumors that match descriptions in medical reports. When trained on 101,654 reports, AI models achieved performance comparable to those trained on 723 masks. R-Super enabled segmentation of tumors in the spleen, gallbladder, prostate, bladder, uterus, and esophagus.
arXiv Detail & Related papers (2025-10-16T15:35:44Z) - DRBD-Mamba for Robust and Efficient Brain Tumor Segmentation with Analytical Insights [54.87947751720332]
Accurate brain tumor segmentation is significant for clinical diagnosis and treatment. Mamba-based State Space Models have demonstrated promising performance. We propose a dual-resolution bi-directional Mamba that captures multi-scale long-range dependencies with minimal computational overhead.
arXiv Detail & Related papers (2025-10-16T07:31:21Z) - Resource-Efficient Glioma Segmentation on Sub-Saharan MRI [4.522693679811991]
This study introduces a robust and computationally efficient deep learning framework tailored for resource-limited settings. We leveraged a 3D Attention U-Net architecture augmented with residual blocks and enhanced through transfer learning from weights pre-trained on the BraTS-Africa dataset. Our model was evaluated on 95 MRI cases from the BraTS-Africa dataset, a benchmark for glioma segmentation in SSA MRI data.
arXiv Detail & Related papers (2025-09-11T13:52:47Z) - Efficient Brain Tumor Segmentation Using a Dual-Decoder 3D U-Net with Attention Gates (DDUNet) [0.0]
Cancer remains one of the leading causes of worldwide mortality, and among its many forms, brain tumors are particularly notorious. Recent advances in artificial intelligence have shown great promise in assisting medical professionals with precise tumor segmentation. We present a novel dual-decoder U-Net architecture enhanced with attention-gated skip connections, designed specifically for brain tumor segmentation from MRI scans.
arXiv Detail & Related papers (2025-04-14T22:45:33Z) - Adult Glioma Segmentation in Sub-Saharan Africa using Transfer Learning on Stratified Finetuning Data [6.14919256198409]
Gliomas present diagnostic challenges in low- and middle-income countries, particularly in Sub-Saharan Africa. This paper introduces a novel approach to glioma segmentation using transfer learning to address challenges in resource-limited regions with minimal and low-quality MRI data.
arXiv Detail & Related papers (2024-12-05T12:29:12Z) - Transferring Knowledge from High-Quality to Low-Quality MRI for Adult Glioma Diagnosis [19.217710134003017]
This paper presents our work in the BraTS Challenge on SSA Adult Glioma.
We adopt the model from the BraTS-GLI 2021 winning solution and utilize it with three training strategies.
Results show that initial training on the BraTS-GLI 2021 dataset followed by fine-tuning on the BraTS-Africa dataset has yielded the best results.
arXiv Detail & Related papers (2024-10-24T12:48:12Z) - Federated Learning Enables Big Data for Rare Cancer Boundary Detection [98.5549882883963]
We present findings from the largest Federated ML study to-date, involving data from 71 healthcare institutions across 6 continents.
We generate an automatic tumor boundary detector for the rare disease of glioblastoma.
We demonstrate a 33% improvement over a publicly trained model to delineate the surgically targetable tumor, and 23% improvement over the tumor's entire extent.
arXiv Detail & Related papers (2022-04-22T17:27:00Z) - EMT-NET: Efficient multitask network for computer-aided diagnosis of breast cancer [58.720142291102135]
We propose an efficient and light-weighted learning architecture to classify and segment breast tumors simultaneously.
We incorporate a segmentation task into a tumor classification network, which makes the backbone network learn representations focused on tumor regions.
The accuracy, sensitivity, and specificity of tumor classification are 88.6%, 94.1%, and 85.3%, respectively.
arXiv Detail & Related papers (2022-01-13T05:24:40Z) - Ensemble CNN Networks for GBM Tumors Segmentation using Multi-parametric MRI [0.0]
We propose a new aggregation of two deep learning frameworks namely, DeepSeg and nnU-Net for automatic glioblastoma recognition in pre-operative mpMRI.
Our ensemble method obtains Dice similarity scores of 92.00, 87.33, and 84.10 and Hausdorff Distances of 3.81, 8.91, and 16.02 for the enhancing tumor, tumor core, and whole tumor regions.
arXiv Detail & Related papers (2021-12-13T10:51:20Z) - Brain tumor segmentation with self-ensembled, deeply-supervised 3D U-net neural networks: a BraTS 2020 challenge solution [56.17099252139182]
We automate and standardize the task of brain tumor segmentation with U-net like neural networks.
Two independent ensembles of models were trained, and each produced a brain tumor segmentation map.
Our solution achieved Dice scores of 0.79, 0.89, and 0.84, as well as 95% Hausdorff distances of 20.4, 6.7, and 19.5 mm on the final test dataset.
arXiv Detail & Related papers (2020-10-30T14:36:10Z)