DeepGleason: a System for Automated Gleason Grading of Prostate Cancer using Deep Neural Networks
- URL: http://arxiv.org/abs/2403.16678v1
- Date: Mon, 25 Mar 2024 12:15:42 GMT
- Title: DeepGleason: a System for Automated Gleason Grading of Prostate Cancer using Deep Neural Networks
- Authors: Dominik Müller, Philip Meyer, Lukas Rentschler, Robin Manz, Jonas Bäcker, Samantha Cramer, Christoph Wengenmayr, Bruno Märkl, Ralf Huss, Iñaki Soto-Rey, Johannes Raffler
- Abstract summary: DeepGleason is an open-source deep neural network based image classification system for automated Gleason grading.
It is capable of highly accurate and reliable Gleason grading with a macro-averaged F1-score of 0.806, AUC of 0.991, and Accuracy of 0.974.
Our tool contributes to the wider adoption of AI-based Gleason grading within the research community.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Advances in digital pathology and artificial intelligence (AI) offer promising opportunities for clinical decision support and enhancing diagnostic workflows. Previous studies already demonstrated AI's potential for automated Gleason grading, but lack state-of-the-art methodology and model reusability. To address this issue, we propose DeepGleason: an open-source deep neural network based image classification system for automated Gleason grading using whole-slide histopathology images from prostate tissue sections. Implemented with the standardized AUCMEDI framework, our tool employs a tile-wise classification approach utilizing fine-tuned image preprocessing techniques in combination with a ConvNeXt architecture which was compared to various state-of-the-art architectures. The neural network model was trained and validated on an in-house dataset of 34,264 annotated tiles from 369 prostate carcinoma slides. We demonstrated that DeepGleason is capable of highly accurate and reliable Gleason grading with a macro-averaged F1-score of 0.806, AUC of 0.991, and Accuracy of 0.974. The internal architecture comparison revealed that the ConvNeXt model was superior performance-wise on our dataset to established and other modern architectures like transformers. Furthermore, we were able to outperform the current state-of-the-art in tile-wise fine-classification with a sensitivity and specificity of 0.94 and 0.98 for benign vs malignant detection as well as of 0.91 and 0.75 for Gleason 3 vs Gleason 4 & 5 classification, respectively. Our tool contributes to the wider adoption of AI-based Gleason grading within the research community and paves the way for broader clinical application of deep learning models in digital pathology. DeepGleason is open-source and publicly available for research application in the following Git repository: https://github.com/frankkramer-lab/DeepGleason.
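The macro-averaged F1-score reported above is the unweighted mean of the per-class F1-scores, which prevents the large benign class from dominating the metric. A minimal sketch of this computation is shown below; the tile labels are illustrative placeholders, not data from the DeepGleason study.

```python
# Sketch of macro-averaged F1 over tile-wise class predictions.
# Labels here are hypothetical examples, not the paper's data.

def per_class_f1(y_true, y_pred, cls):
    """F1-score for one class, treating it as the positive label."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != cls and p == cls)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p != cls)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1-scores (macro-averaging)."""
    classes = sorted(set(y_true) | set(y_pred))
    return sum(per_class_f1(y_true, y_pred, c) for c in classes) / len(classes)

# Illustrative tile labels: 0 = benign, 1 = Gleason 3, 2 = Gleason 4/5
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
print(round(macro_f1(y_true, y_pred), 4))  # 0.6556
```

In practice a library implementation such as scikit-learn's `f1_score(..., average="macro")` would be used; the pure-Python version above only makes the averaging explicit.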
Related papers
- Assessing the Performance of Deep Learning for Automated Gleason Grading in Prostate Cancer
This study explores the potential of 11 deep neural network architectures for automated Gleason grading in prostate carcinoma.
A standardized image classification pipeline, based on the AUCMEDI framework, facilitated robust evaluation.
Newer architectures achieved superior performance, albeit with challenges in differentiating closely related Gleason grades.
arXiv Detail & Related papers (2024-03-25T12:26:32Z)
- DDxT: Deep Generative Transformer Models for Differential Diagnosis
We show that a generative approach trained with simpler supervised and self-supervised learning signals can achieve superior results on the current benchmark.
The proposed Transformer-based generative network, named DDxT, autoregressively produces a set of possible pathologies, i.e., DDx, and predicts the actual pathology using a neural network.
arXiv Detail & Related papers (2023-12-02T22:57:25Z)
- Weakly-Supervised Deep Learning Model for Prostate Cancer Diagnosis and Gleason Grading of Histopathology Images
We propose a weakly-supervised algorithm to classify prostate cancer grades.
The proposed algorithm consists of three steps: extracting discriminative areas in a histopathology image, representing the image, and classifying the image into its Gleason grades.
Results show that the proposed model achieved state-of-the-art performance in the Gleason grading task in terms of accuracy, F1-score, and Cohen's kappa.
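Cohen's kappa, mentioned here (and in its quadratically weighted form later in this list), measures agreement between predicted and reference grades while correcting for chance agreement. A minimal sketch of the quadratically weighted variant follows; the example labels are hypothetical, not data from any cited study.

```python
# Sketch of quadratically weighted Cohen's kappa, a chance-corrected
# agreement measure commonly used for ordinal labels such as Gleason grades.
# The example labels below are hypothetical, not results from these papers.

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    n = len(y_true)
    # Observed confusion matrix: rows = true class, columns = predicted class
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected counts under independence of the two marginal distributions
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes))
                 for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            # Quadratic penalty: disagreements between distant grades cost more
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

print(quadratic_weighted_kappa([0, 1, 2, 1], [0, 1, 2, 1], 3))  # perfect agreement -> 1.0
print(quadratic_weighted_kappa([0, 0, 1, 2], [0, 1, 1, 2], 3))  # 0.8
```

The quadratic weights make the statistic forgiving of adjacent-grade confusions but harsh on distant ones, which matches how Gleason-grading disagreements are usually judged clinically.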
arXiv Detail & Related papers (2022-12-25T03:07:52Z)
- Medical Application of Geometric Deep Learning for the Diagnosis of Glaucoma
3D scans of the optic nerve head (ONH) were acquired with Spectralis OCT for 477 glaucoma and 2,296 non-glaucoma subjects at the Singapore National Eye Centre.
All volumes were automatically segmented using deep learning to identify 7 major neural and connective tissues.
PointNet was able to provide a robust glaucoma diagnosis solely from the ONH represented as a 3D point cloud.
arXiv Detail & Related papers (2022-04-14T14:55:25Z)
- Assessing glaucoma in retinal fundus photographs using Deep Feature Consistent Variational Autoencoders
Glaucoma is challenging to detect since it remains asymptomatic until symptoms are severe.
Early identification of glaucoma is generally made based on functional, structural, and clinical assessments.
Deep learning methods have partially solved this dilemma by bypassing the marker identification stage and analyzing high-level information directly to classify the data.
arXiv Detail & Related papers (2021-10-04T16:06:49Z)
- Vision Transformers for femur fracture classification
The Vision Transformer (ViT) was able to correctly predict 83% of the test images.
Good results were also obtained on sub-fracture classification, using the largest and richest dataset of its kind to date.
arXiv Detail & Related papers (2021-08-07T10:12:42Z)
- A multi-stage machine learning model on diagnosis of esophageal manometry
The framework includes deep-learning models at the swallow-level stage and feature-based machine learning models at the study-level stage.
This is the first artificial-intelligence-style model to automatically predict the Chicago Classification (CC) diagnosis of a high-resolution manometry (HRM) study from raw multi-swallow data.
arXiv Detail & Related papers (2021-06-25T20:09:23Z)
- Going Deeper through the Gleason Scoring Scale: An Automatic end-to-end System for Histology Prostate Grading and Cribriform Pattern Detection
The objective of this work is to develop a deep-learning-based system able to support pathologists in the daily analysis of prostate biopsies.
The methodological core of this work is a patch-wise predictive model based on convolutional neural networks able to determine the presence of cancerous patterns.
arXiv Detail & Related papers (2021-05-21T17:51:53Z)
- WeGleNet: A Weakly-Supervised Convolutional Neural Network for the Semantic Segmentation of Gleason Grades in Prostate Histology Images
We propose a deep-learning-based system able to detect local cancerous patterns in the prostate tissue using only the global-level Gleason score during training.
We obtained a Cohen's quadratic kappa (k) of 0.67 for the pixel-level prediction of cancerous patterns in the validation cohort.
We compared the model performance for semantic segmentation of Gleason grades with supervised state-of-the-art architectures in the test cohort.
arXiv Detail & Related papers (2021-05-21T16:27:16Z)
- Automated Prostate Cancer Diagnosis Based on Gleason Grading Using Convolutional Neural Network
We propose a convolutional neural network (CNN)-based automatic classification method for accurate grading of prostate cancer (PCa) using whole slide histopathology images.
A data augmentation method named Patch-Based Image Reconstruction (PBIR) was proposed to reduce the high resolution and increase the diversity of whole-slide images (WSIs).
A distribution correction module was developed to enhance the adaptation of the pretrained model to the target dataset.
arXiv Detail & Related papers (2020-11-29T06:42:08Z)
- Gleason Grading of Histology Prostate Images through Semantic Segmentation via Residual U-Net
The final diagnosis of prostate cancer is based on the visual detection of Gleason patterns in prostate biopsy by pathologists.
Computer-aided diagnosis systems make it possible to delineate and classify the cancerous patterns in the tissue.
The methodological core of this work is a U-Net convolutional neural network for image segmentation, modified with residual blocks, able to segment cancerous tissue.
arXiv Detail & Related papers (2020-05-22T19:49:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.