Comparison of automatic prostate zones segmentation models in MRI images
using U-net-like architectures
- URL: http://arxiv.org/abs/2207.09483v1
- Date: Tue, 19 Jul 2022 18:00:41 GMT
- Title: Comparison of automatic prostate zones segmentation models in MRI images
using U-net-like architectures
- Authors: Pablo Cesar Quihui-Rubio and Gilberto Ochoa-Ruiz and Miguel
Gonzalez-Mendoza and Gerardo Rodriguez-Hernandez and Christian Mata
- Abstract summary: Prostate cancer is the sixth leading cause of cancer death in males worldwide.
Currently, the segmentation of Regions of Interest (ROI) containing tumor tissue is carried out manually by expert doctors.
Several research works have tackled the challenge of automatically segmenting and extracting features of the ROI from magnetic resonance images.
In this work, six deep learning models were trained and analyzed with a dataset of MRI images obtained from the Centre Hospitalaire de Dijon and Universitat Politecnica de Catalunya.
- Score: 0.9786690381850356
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Prostate cancer is the second-most frequently diagnosed cancer and the sixth
leading cause of cancer death in males worldwide. The main problem that
specialists face during the diagnosis of prostate cancer is the localization of
Regions of Interest (ROI) containing tumor tissue. Currently, the
segmentation of this ROI in most cases is carried out manually by expert
doctors, but the procedure is plagued with low detection rates (of about
27-44%) or overdiagnosis in some patients. Therefore, several research works
have tackled the challenge of automatically segmenting and extracting features
of the ROI from magnetic resonance images, as this process can greatly
facilitate many diagnostic and therapeutic applications. However, the lack of
clear prostate boundaries, the heterogeneity inherent to the prostate tissue,
and the variety of prostate shapes make this process very difficult to
automate. In this work, six deep learning models were trained and analyzed with
a dataset of MRI images obtained from the Centre Hospitalaire de Dijon and
Universitat Politecnica de Catalunya. We carried out a comparison of multiple
deep learning models (i.e. U-Net, Attention U-Net, Dense-UNet, Attention
Dense-UNet, R2U-Net, and Attention R2U-Net) using categorical cross-entropy
loss function. The analysis was performed using three metrics commonly used for
image segmentation: Dice score, Jaccard index, and mean squared error. The
model that gave the best results segmenting all the zones was R2U-Net, which
achieved 0.869, 0.782, and 0.00013 for Dice, Jaccard, and mean squared error,
respectively.
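The three evaluation metrics can be sketched as follows. This is a minimal NumPy sketch for binary masks; the paper evaluates multi-class zone predictions, so in practice these would be computed per zone. The function names are illustrative, not taken from the paper's code.

```python
import numpy as np

def dice_score(pred, target, eps=1e-7):
    # Dice = 2|A ∩ B| / (|A| + |B|)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

def jaccard_index(pred, target, eps=1e-7):
    # Jaccard (IoU) = |A ∩ B| / |A ∪ B|
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return (inter + eps) / (union + eps)

def mse(pred, target):
    # Mean squared error over all pixels
    return np.mean((pred.astype(float) - target.astype(float)) ** 2)

# Toy example: two overlapping 4x4 binary masks
pred = np.zeros((4, 4), dtype=bool)
pred[1:3, 1:3] = True      # 4 predicted pixels
target = np.zeros((4, 4), dtype=bool)
target[1:3, 1:4] = True    # 6 ground-truth pixels, 4 overlapping

print(dice_score(pred, target), jaccard_index(pred, target), mse(pred, target))
```

Dice and Jaccard are overlap measures in [0, 1] (higher is better), while MSE penalizes every mislabeled pixel (lower is better), which is why the reported R2U-Net values combine high Dice/Jaccard with a near-zero MSE.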
Related papers
- Assessing the performance of deep learning-based models for prostate
cancer segmentation using uncertainty scores [1.0499611180329804]
The aim is to improve the workflow of prostate cancer detection and diagnosis.
The top-performing model is the Attention R2U-Net, achieving a mean Intersection over Union (IoU) of 76.3% and Dice Similarity Coefficient (DSC) of 85% for segmenting all zones.
arXiv Detail & Related papers (2023-08-09T01:38:58Z) - MicroSegNet: A Deep Learning Approach for Prostate Segmentation on
Micro-Ultrasound Images [10.10595151162924]
Micro-ultrasound (micro-US) is a novel 29-MHz ultrasound technique that provides 3-4 times higher resolution than traditional ultrasound.
Prostate segmentation on micro-US is challenging due to artifacts and indistinct borders between the prostate, bladder, and urethra in the midline.
This paper presents MicroSegNet, a multi-scale annotation-guided transformer UNet model designed specifically to tackle these challenges.
arXiv Detail & Related papers (2023-05-31T15:42:29Z) - Moving from 2D to 3D: volumetric medical image classification for rectal
cancer staging [62.346649719614]
Preoperative discrimination between T2 and T3 stages is arguably both the most challenging and clinically significant task for rectal cancer treatment.
We present a volumetric convolutional neural network to accurately discriminate T2 from T3 stage rectal cancer with rectal MR volumes.
arXiv Detail & Related papers (2022-09-13T07:10:14Z) - A unified 3D framework for Organs at Risk Localization and Segmentation
for Radiation Therapy Planning [56.52933974838905]
Current medical workflows require manual delineation of organs-at-risk (OAR).
In this work, we aim to introduce a unified 3D pipeline for OAR localization-segmentation.
Our proposed framework fully enables the exploitation of 3D context information inherent in medical imaging.
arXiv Detail & Related papers (2022-03-01T17:08:41Z) - Deep Learning Based Analysis of Prostate Cancer from MP-MRI [0.0]
The diagnosis of prostate cancer faces a problem with overdiagnosis that leads to damaging side effects due to unnecessary treatment.
This study aims to investigate the use of deep learning techniques to explore computer-aided diagnosis based on MRI as input.
arXiv Detail & Related papers (2021-06-02T12:42:35Z) - Wide & Deep neural network model for patch aggregation in CNN-based
prostate cancer detection systems [51.19354417900591]
Prostate cancer (PCa) is one of the leading causes of death among men, with almost 1.41 million new cases and around 375,000 deaths in 2020.
To perform an automatic diagnosis, prostate tissue samples are first digitized into gigapixel-resolution whole-slide images.
Small subimages called patches are extracted and predicted, obtaining a patch-level classification.
arXiv Detail & Related papers (2021-05-20T18:13:58Z) - A Cascaded Residual UNET for Fully Automated Segmentation of Prostate
and Peripheral Zone in T2-weighted 3D Fast Spin Echo Images [1.6710577107094644]
Multi-parametric MR images have been shown to be effective in the non-invasive diagnosis of prostate cancer.
We propose a fully automated cascaded deep learning architecture with residual blocks, Cascaded MRes-UNET, for segmentation of the prostate gland and the peripheral zone.
arXiv Detail & Related papers (2020-12-25T03:16:52Z) - Segmentation of the Myocardium on Late-Gadolinium Enhanced MRI based on
2.5 D Residual Squeeze and Excitation Deep Learning Model [55.09533240649176]
The aim of this work is to develop an accurate automatic segmentation method based on deep learning models for the myocardial borders on LGE-MRI.
A total number of 320 exams (with a mean number of 6 slices per exam) were used for training and 28 exams used for testing.
The performance of the proposed ensemble model in the basal and middle slices was similar to that of the intra-observer study, and slightly lower in the apical slices.
arXiv Detail & Related papers (2020-05-27T20:44:38Z) - Gleason Grading of Histology Prostate Images through Semantic
Segmentation via Residual U-Net [60.145440290349796]
The final diagnosis of prostate cancer is based on the visual detection of Gleason patterns in prostate biopsy by pathologists.
Computer-aided-diagnosis systems make it possible to delineate and classify the cancerous patterns in the tissue.
The methodological core of this work is a U-Net convolutional neural network for image segmentation modified with residual blocks able to segment cancerous tissue.
arXiv Detail & Related papers (2020-05-22T19:49:10Z) - A Global Benchmark of Algorithms for Segmenting Late Gadolinium-Enhanced
Cardiac Magnetic Resonance Imaging [90.29017019187282]
The "2018 Left Atrium Challenge" used 154 3D LGE-MRIs, currently the world's largest cardiac LGE-MRI dataset.
Analysis of the submitted algorithms was performed using technical and biological metrics.
Results show the top method achieved a Dice score of 93.2% and a mean surface-to-surface distance of 0.7 mm.
arXiv Detail & Related papers (2020-04-26T08:49:17Z) - Breast Cancer Detection Using Convolutional Neural Networks [0.0]
Breast cancer is prevalent in Ethiopia, accounting for 34% of cancer cases among women.
Deep learning techniques are revolutionizing the field of medical image analysis.
Our model detects mass regions and classifies them as benign or malignant abnormalities in mammogram (MG) images at once.
arXiv Detail & Related papers (2020-03-17T19:41:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.