ProstAttention-Net: A deep attention model for prostate cancer
segmentation by aggressiveness in MRI scans
- URL: http://arxiv.org/abs/2211.13238v1
- Date: Wed, 23 Nov 2022 16:21:21 GMT
- Authors: Audrey Duran (MYRIAD), Gaspard Dussert (MYRIAD), Olivier Rouvière,
Tristan Jaouen, Pierre-Marc Jodoin, Carole Lartizien (MYRIAD)
- Abstract summary: We propose a novel end-to-end multi-class network that jointly segments the prostate gland and cancer lesions with Gleason score (GS) group grading.
Our model achieves 69.0% $\pm$14.5% sensitivity at 2.9 false positives per patient on the whole prostate and 70.8% $\pm$14.4% sensitivity at 1.5 false positives when considering the peripheral zone (PZ) only.
- Score: 4.964026843682986
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multiparametric magnetic resonance imaging (mp-MRI) has shown excellent
results in the detection of prostate cancer (PCa). However, characterizing the
aggressiveness of prostate lesions from mp-MRI sequences is not feasible in
clinical practice, and biopsy remains the reference standard for determining the Gleason score (GS).
In this work, we propose a novel end-to-end multi-class network that jointly
segments the prostate gland and cancer lesions with GS group grading. After
encoding the information in a latent space, the network is separated into two
branches: 1) the first branch performs prostate segmentation; 2) the second
branch uses this zonal prior as an attention gate for the detection and grading
of prostate lesions. The model was trained and validated with a 5-fold
cross-validation on a heterogeneous series of 219 MRI exams acquired on three
different scanners prior to prostatectomy. In the free-response receiver operating
characteristics (FROC) analysis for clinically significant lesions (defined as
GS > 6) detection, our model achieves 69.0% $\pm$14.5% sensitivity at 2.9 false
positives per patient on the whole prostate and 70.8% $\pm$14.4% sensitivity at
1.5 false positives when considering the peripheral zone (PZ) only. Regarding
the automatic GS group
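The core idea of the two-branch design, in which the prostate-branch output serves as a zonal prior gating the lesion branch, can be sketched as a simple elementwise soft gate. This is an illustrative assumption, not the authors' implementation; the function and array shapes below are hypothetical.

```python
import numpy as np

# Illustrative sketch only (not the paper's code): the prostate branch's
# probability map acts as a soft attention gate that multiplies the lesion
# branch's feature maps, suppressing activations outside the gland before
# lesion detection and GS-group grading.
def attention_gate(lesion_features: np.ndarray, prostate_prob: np.ndarray) -> np.ndarray:
    """lesion_features: (C, H, W) lesion-branch feature maps.
    prostate_prob: (H, W) prostate probability in [0, 1] (the zonal prior)."""
    return lesion_features * prostate_prob[np.newaxis, :, :]

lesion = np.ones((4, 8, 8))   # dummy lesion-branch features
prior = np.zeros((8, 8))
prior[2:6, 2:6] = 0.9         # high prostate probability inside the gland
gated = attention_gate(lesion, prior)
```

Features outside the gated region are zeroed, so downstream lesion predictions are confined to the predicted gland.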
Related papers
- Enhancing Clinically Significant Prostate Cancer Prediction in T2-weighted Images through Transfer Learning from Breast Cancer [71.91773485443125]
Transfer learning is a technique that leverages acquired features from a domain with richer data to enhance the performance of a domain with limited data.
In this paper, we investigate the improvement of clinically significant prostate cancer prediction in T2-weighted images through transfer learning from breast cancer.
arXiv Detail & Related papers (2024-05-13T15:57:27Z) - ProsDectNet: Bridging the Gap in Prostate Cancer Detection via
Transrectal B-mode Ultrasound Imaging [2.6024562346319167]
ProsDectNet is a multi-task deep learning approach that localizes prostate cancer on B-mode ultrasound.
We trained and validated ProsDectNet using a cohort of 289 patients who underwent MRI-TRUS fusion targeted biopsy.
Our results demonstrate that ProsDectNet has the potential to be used as a computer-aided diagnosis system.
arXiv Detail & Related papers (2023-12-08T19:40:35Z) - Cancer-Net PCa-Gen: Synthesis of Realistic Prostate Diffusion Weighted
Imaging Data via Anatomic-Conditional Controlled Latent Diffusion [68.45407109385306]
In Canada, prostate cancer is the most common form of cancer in men and accounted for 20% of new cancer cases for this demographic in 2022.
There has been significant interest in the development of deep neural networks for prostate cancer diagnosis, prognosis, and treatment planning using diffusion weighted imaging (DWI) data.
In this study, we explore the efficacy of latent diffusion for generating realistic prostate DWI data through the introduction of an anatomic-conditional controlled latent diffusion strategy.
arXiv Detail & Related papers (2023-11-30T15:11:03Z) - Prostate Lesion Estimation using Prostate Masks from Biparametric MRI [0.0]
Biparametric MRI has emerged as an alternative to multiparametric prostate MRI.
One major issue with biparametric MRI is the difficulty of detecting clinically significant prostate cancer (csPCA).
Deep learning algorithms have emerged as an alternative solution to detect csPCA in cohort studies.
arXiv Detail & Related papers (2023-01-11T13:20:24Z) - A Pathologist-Informed Workflow for Classification of Prostate Glands in
Histopathology [62.997667081978825]
Pathologists diagnose and grade prostate cancer by examining tissue from needle biopsies on glass slides.
Cancer's severity and risk of metastasis are determined by the Gleason grade, a score based on the organization and morphology of prostate cancer glands.
This paper proposes an automated workflow that follows pathologists' modus operandi, isolating and classifying multi-scale patches of individual glands.
arXiv Detail & Related papers (2022-09-27T14:08:19Z) - Controlling False Positive/Negative Rates for Deep-Learning-Based
Prostate Cancer Detection on Multiparametric MR images [58.85481248101611]
We propose a novel PCa detection network that incorporates a lesion-level cost-sensitive loss and an additional slice-level loss based on a lesion-to-slice mapping function.
Our experiments on 290 clinical patients conclude that 1) the lesion-level FNR was effectively reduced from 0.19 to 0.10 and the lesion-level FPR was reduced from 1.03 to 0.66 by changing the lesion-level cost.
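A cost-sensitive loss of this kind can be sketched as a weighted binary cross-entropy in which raising the false-negative weight trades false positives for false negatives. The form and weights below are illustrative assumptions, not the paper's actual loss.

```python
import math

# Hypothetical sketch of a cost-sensitive binary cross-entropy: w_fn > w_fp
# penalizes missed lesions (false negatives) more than false alarms, shifting
# the operating point toward lower FNR at the cost of higher FPR.
def cost_sensitive_bce(p: float, y: int, w_fn: float = 2.0, w_fp: float = 1.0) -> float:
    eps = 1e-7
    p = min(max(p, eps), 1.0 - eps)  # clamp to avoid log(0)
    return -(w_fn * y * math.log(p) + w_fp * (1 - y) * math.log(1.0 - p))

# With w_fn = 2.0, missing a lesion (y=1 at p=0.5) costs twice a false alarm.
miss_cost = cost_sensitive_bce(0.5, 1)
alarm_cost = cost_sensitive_bce(0.5, 0)
```

Tuning `w_fn` against `w_fp` is one common way to steer a detector's FNR/FPR balance without changing the architecture.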
arXiv Detail & Related papers (2021-06-04T09:51:27Z) - Deep Learning for fully automatic detection, segmentation, and Gleason
Grade estimation of prostate cancer in multiparametric Magnetic Resonance
Images [0.731365367571807]
This paper proposes a fully automatic system based on Deep Learning that takes a prostate mpMRI from a PCa-suspect patient.
It locates PCa lesions, segments them, and predicts their most likely Gleason grade group (GGG).
The code for the ProstateX-trained system has been made openly available at https://github.com/OscarPellicer/prostate_lesion_detection.
arXiv Detail & Related papers (2021-03-23T16:08:43Z) - A Cascaded Residual UNET for Fully Automated Segmentation of Prostate
and Peripheral Zone in T2-weighted 3D Fast Spin Echo Images [1.6710577107094644]
Multi-parametric MR images have been shown to be effective in the non-invasive diagnosis of prostate cancer.
We propose a fully automated cascaded deep learning architecture with residual blocks, Cascaded MRes-UNET, for segmentation of the prostate gland and the peripheral zone.
arXiv Detail & Related papers (2020-12-25T03:16:52Z) - Assisted Probe Positioning for Ultrasound Guided Radiotherapy Using
Image Sequence Classification [55.96221340756895]
Effective transperineal ultrasound image guidance in prostate external beam radiotherapy requires consistent alignment between probe and prostate at each session during patient set-up.
We demonstrate a method for ensuring accurate probe placement through joint classification of images and probe position data.
Using a multi-input multi-task algorithm, spatial coordinate data from an optically tracked ultrasound probe is combined with an image classifier using a recurrent neural network to generate two sets of predictions in real-time.
The algorithm identified optimal probe alignment within a mean (standard deviation) range of 3.7$^\circ$ (1.2$^\circ$) from
arXiv Detail & Related papers (2020-10-06T13:55:02Z) - CorrSigNet: Learning CORRelated Prostate Cancer SIGnatures from
Radiology and Pathology Images for Improved Computer Aided Diagnosis [1.63324350193061]
We propose CorrSigNet, an automated two-step model that localizes prostate cancer on MRI.
First, the model learns MRI signatures of cancer that are correlated with corresponding histopathology features.
Second, the model uses the learned correlated MRI features to train a Convolutional Neural Network to localize prostate cancer.
arXiv Detail & Related papers (2020-07-31T23:44:25Z) - Segmentation for Classification of Screening Pancreatic Neuroendocrine
Tumors [72.65802386845002]
This work presents comprehensive results to detect in the early stage the pancreatic neuroendocrine tumors (PNETs) in abdominal CT scans.
To the best of our knowledge, this task has not been studied before as a computational task.
Our approach outperforms state-of-the-art segmentation networks and achieves a sensitivity of 89.47% at a specificity of 81.08%.
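An operating point like this follows directly from confusion-matrix counts. The counts below are made up for illustration (chosen only so the ratios land near the reported percentages), not the paper's data.

```python
# Sensitivity (recall on positives) and specificity (recall on negatives)
# from confusion-matrix counts; the counts here are illustrative only.
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple:
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sensitivity_specificity(tp=17, fn=2, tn=30, fp=7)
```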
arXiv Detail & Related papers (2020-04-04T21:21:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.