NNDM: NN_UNet Diffusion Model for Brain Tumor Segmentation
- URL: http://arxiv.org/abs/2510.09681v1
- Date: Wed, 08 Oct 2025 22:19:08 GMT
- Title: NNDM: NN_UNet Diffusion Model for Brain Tumor Segmentation
- Authors: Sashank Makanaboyina
- Abstract summary: We propose NNDM (NN_UNet Diffusion Model), a hybrid framework that integrates the robust feature extraction of NN-UNet with the generative capabilities of diffusion probabilistic models. In our approach, the diffusion model progressively refines the segmentation masks generated by NN-UNet by learning the residual error distribution between predicted and ground-truth masks. Experiments conducted on the BraTS 2021 dataset demonstrate that NNDM achieves superior performance compared to conventional U-Net and transformer-based baselines.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Accurate detection and segmentation of brain tumors in magnetic resonance imaging (MRI) are critical for effective diagnosis and treatment planning. Despite advances in convolutional neural networks (CNNs) such as U-Net, existing models often struggle with generalization, boundary precision, and limited data diversity. To address these challenges, we propose NNDM (NN_UNet Diffusion Model), a hybrid framework that integrates the robust feature extraction of NN-UNet with the generative capabilities of diffusion probabilistic models. In our approach, the diffusion model progressively refines the segmentation masks generated by NN-UNet by learning the residual error distribution between predicted and ground-truth masks. This iterative denoising process enables the model to correct fine structural inconsistencies and enhance tumor boundary delineation. Experiments conducted on the BraTS 2021 dataset demonstrate that NNDM achieves superior performance compared to conventional U-Net and transformer-based baselines, yielding improvements in Dice coefficient and Hausdorff distance metrics. Moreover, the diffusion-guided refinement enhances robustness across modalities and tumor subregions. The proposed NNDM establishes a new direction for combining deterministic segmentation networks with stochastic diffusion models, advancing the state of the art in automated brain tumor analysis.
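The abstract describes the core mechanism: NN-UNet produces a coarse mask, and a diffusion model iteratively subtracts a learned estimate of the residual error between prediction and ground truth, with quality measured by the Dice coefficient. A minimal sketch of that refinement loop and the Dice metric is below; the paper does not publish code, so `toy_denoiser`, the step count, and the 0.5 binarization threshold are illustrative assumptions, with the toy denoiser standing in for the trained residual predictor.

```python
import numpy as np

def dice_coefficient(pred, gt, eps=1e-7):
    """Dice overlap between two binary masks (1.0 = perfect agreement)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    return (2.0 * inter + eps) / (pred.sum() + gt.sum() + eps)

def refine_mask(coarse_logits, denoiser, steps=10):
    """Diffusion-style refinement: at each reverse step the denoiser
    predicts the residual error of the current mask estimate, and that
    residual is subtracted out before final binarization."""
    x = coarse_logits.copy()
    for t in reversed(range(steps)):
        residual = denoiser(x, t)  # learned residual-error predictor (hypothetical)
        x = x - residual           # remove the predicted error
    return (x > 0.5).astype(np.uint8)

# Toy stand-in for the trained network: removes a small constant bias.
toy_denoiser = lambda x, t: np.full_like(x, 0.02)

# Coarse NN-UNet-like output: correct region but biased soft scores.
coarse = np.zeros((8, 8)); coarse[2:6, 2:6] = 0.9
gt = np.zeros((8, 8), dtype=np.uint8); gt[2:6, 2:6] = 1
refined = refine_mask(coarse, toy_denoiser, steps=10)
print(round(dice_coefficient(refined, gt), 3))  # → 1.0
```

In the paper's setting the denoiser is a trained network conditioned on the MRI volume and the NN-UNet mask, so each step corrects structural errors (e.g. ragged tumor boundaries) rather than a constant bias as in this toy.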
Related papers
- Rethinking Message Passing Neural Networks with Diffusion Distance-guided Stress Majorization [22.36875245255393]
Message passing neural networks (MPNNs) have emerged as go-to models for learning on graph-structured data. MPNNs still incur severe issues such as over-smoothing and over-correlation. We propose a new MPNN model built on an optimization framework that includes stress majorization and regularization.
arXiv Detail & Related papers (2025-11-25T06:52:35Z) - Bayesian Neural Networks vs. Mixture Density Networks: Theoretical and Empirical Insights for Uncertainty-Aware Nonlinear Modeling [2.0797819204842036]
We compare the approaches of Bayesian Neural Networks (BNNs) and Mixture Density Networks (MDNs) for uncertainty-aware nonlinear regression. On the theoretical side, we derive convergence rates and error bounds under Hölder smoothness conditions, showing that MDNs achieve faster Kullback-Leibler (KL) divergence convergence. Our findings clarify the complementary strengths of posterior-based and likelihood-based probabilistic learning, offering guidance for uncertainty-aware modeling in nonlinear systems.
arXiv Detail & Related papers (2025-10-28T22:00:30Z) - VGDM: Vision-Guided Diffusion Model for Brain Tumor Detection and Segmentation [2.538209532048867]
VGDM is a vision-guided diffusion framework for brain tumor detection and segmentation. It embeds a vision transformer at the core of the diffusion process. The transformer backbone enables more effective modeling of spatial relationships across entire MRI volumes. It mitigates voxel-level errors and recovers fine-grained tumor details.
arXiv Detail & Related papers (2025-10-02T14:52:08Z) - uGMM-NN: Univariate Gaussian Mixture Model Neural Network [0.0]
uGMM-NN is a novel neural architecture that embeds probabilistic reasoning directly into the computational units of deep networks. We demonstrate that uGMM-NN can achieve competitive discriminative performance compared to conventional multilayer perceptrons.
arXiv Detail & Related papers (2025-09-09T10:13:37Z) - ReDiSC: A Reparameterized Masked Diffusion Model for Scalable Node Classification with Structured Predictions [64.17845687013434]
We propose ReDiSC, a structured diffusion model for structured node classification. We show that ReDiSC achieves superior or highly competitive performance compared to state-of-the-art GNN, label propagation, and diffusion-based baselines. Notably, ReDiSC scales effectively to large-scale datasets on which previous structured diffusion methods fail due to computational constraints.
arXiv Detail & Related papers (2025-07-19T04:46:53Z) - GANet-Seg: Adversarial Learning for Brain Tumor Segmentation with Hybrid Generative Models [1.0456203870202954]
This work introduces a novel framework for brain tumor segmentation leveraging pre-trained GANs and U-Net architectures. By combining a global anomaly detection module with a refined mask generation network, the proposed model accurately identifies tumor-sensitive regions. Multi-modal MRI data and synthetic image augmentation are employed to improve robustness and address the challenge of limited annotated datasets.
arXiv Detail & Related papers (2025-06-26T13:28:09Z) - Latent Diffusion Model Based Denoising Receiver for 6G Semantic Communication: From Stochastic Differential Theory to Application [11.385703484113552]
We propose a novel semantic communication framework empowered by generative artificial intelligence (GAI). A latent diffusion model (LDM)-based semantic communication framework is proposed that combines a variational autoencoder for semantic feature extraction. The proposed system is a training-free framework that supports zero-shot generalization and achieves superior performance under low-SNR and out-of-distribution conditions.
arXiv Detail & Related papers (2025-06-06T03:20:32Z) - Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval. A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed. The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z) - Analyzing Neural Network-Based Generative Diffusion Models through Convex Optimization [45.72323731094864]
We present a theoretical framework to analyze two-layer neural network-based diffusion models.
We prove that training shallow neural networks for score prediction can be done by solving a single convex program.
Our results provide a precise characterization of what neural network-based diffusion models learn in non-asymptotic settings.
arXiv Detail & Related papers (2024-02-03T00:20:25Z) - Fully Spiking Denoising Diffusion Implicit Models [61.32076130121347]
Spiking neural networks (SNNs) have garnered considerable attention owing to their ability to run on neuromorphic devices with super-high speeds.
We propose a novel approach fully spiking denoising diffusion implicit model (FSDDIM) to construct a diffusion model within SNNs.
We demonstrate that the proposed method outperforms the state-of-the-art fully spiking generative model.
arXiv Detail & Related papers (2023-12-04T09:07:09Z) - Influence Estimation and Maximization via Neural Mean-Field Dynamics [60.91291234832546]
We propose a novel learning framework using neural mean-field (NMF) dynamics for inference and estimation problems.
Our framework can simultaneously learn the structure of the diffusion network and the evolution of node infection probabilities.
arXiv Detail & Related papers (2021-06-03T00:02:05Z) - LocalDrop: A Hybrid Regularization for Deep Neural Networks [98.30782118441158]
We propose a new approach for the regularization of neural networks by the local Rademacher complexity called LocalDrop.
A new regularization function for both fully-connected networks (FCNs) and convolutional neural networks (CNNs) has been developed based on the proposed upper bound of the local Rademacher complexity.
arXiv Detail & Related papers (2021-03-01T03:10:11Z) - Network Diffusions via Neural Mean-Field Dynamics [52.091487866968286]
We propose a novel learning framework for inference and estimation problems of diffusion on networks.
Our framework is derived from the Mori-Zwanzig formalism to obtain an exact evolution of the node infection probabilities.
Our approach is versatile and robust to variations of the underlying diffusion network models.
arXiv Detail & Related papers (2020-06-16T18:45:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.