AutoPET Challenge: Combining nn-Unet with Swin UNETR Augmented by
Maximum Intensity Projection Classifier
- URL: http://arxiv.org/abs/2209.01112v1
- Date: Fri, 2 Sep 2022 15:20:28 GMT
- Title: AutoPET Challenge: Combining nn-Unet with Swin UNETR Augmented by
Maximum Intensity Projection Classifier
- Authors: Lars Heiliger, Zdravko Marinov, André Ferreira, Jana Fragemann,
Jacob Murray, David Kersting, Rainer Stiefelhagen, Jens Kleesiek
- Abstract summary: The AutoPET challenge provides a public data set with FDG-PET/CT scans from 900 patients.
Our solution achieves a Dice score of 72.12% on patients diagnosed with lung cancer, melanoma, and lymphoma.
- Score: 15.924886815041774
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Tumor volume and changes in tumor characteristics over time are important
biomarkers for cancer therapy. In this context, FDG-PET/CT scans are routinely
used for staging and re-staging of cancer, as the radiolabeled
fluorodeoxyglucose is taken up in regions of high metabolism. Unfortunately,
these regions with high metabolism are not specific to tumors and can also
represent physiological uptake by normal functioning organs, inflammation, or
infection, making detailed and reliable tumor segmentation in these scans a
demanding task. This gap in research is addressed by the AutoPET challenge,
which provides a public data set with FDG-PET/CT scans from 900 patients to
encourage further improvement in this field. Our contribution to this challenge
is an ensemble of two state-of-the-art segmentation models, the nn-Unet and the
Swin UNETR, augmented by a maximum intensity projection classifier that acts
like a gating mechanism. If it predicts the existence of lesions, both
segmentations are combined by a late fusion approach. Our solution achieves a
Dice score of 72.12% on patients diagnosed with lung cancer, melanoma, and
lymphoma in our cross-validation. Code:
https://github.com/heiligerl/autopet_submission
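The gating-plus-late-fusion scheme from the abstract can be sketched in a few lines. This is a minimal, hypothetical illustration, not the authors' implementation: the `lesion_classifier` argument stands in for the paper's maximum intensity projection (MIP) classifier, whose architecture is not described here, and the simple probability averaging is one common late-fusion choice.

```python
import numpy as np

def mip_gated_fusion(pet_volume, seg_a, seg_b, lesion_classifier, threshold=0.5):
    """Sketch of MIP-classifier gating over two segmentation models.

    pet_volume: 3D PET scan (z, y, x).
    seg_a, seg_b: per-voxel lesion probability maps from the two models
                  (e.g. nn-Unet and Swin UNETR).
    lesion_classifier: hypothetical stand-in for the MIP classifier; maps a
                       2D projection to a lesion-presence score in [0, 1].
    """
    # Maximum intensity projection along the axial direction.
    mip = pet_volume.max(axis=0)
    # Gate: if no lesion is predicted, return an empty mask.
    if lesion_classifier(mip) < threshold:
        return np.zeros_like(seg_a, dtype=np.uint8)
    # Late fusion: average the two probability maps, then binarize.
    fused = (seg_a + seg_b) / 2.0
    return (fused >= threshold).astype(np.uint8)

# Toy usage with a stand-in classifier that scores by peak MIP intensity.
pet = np.random.rand(8, 16, 16)
prob_nnunet = np.random.rand(8, 16, 16)
prob_swin = np.random.rand(8, 16, 16)
mask = mip_gated_fusion(pet, prob_nnunet, prob_swin,
                        lesion_classifier=lambda m: float(m.max()))
```

The gate prevents the ensemble from emitting spurious lesions on scans the classifier deems negative, which matters because physiological uptake can mimic tumors in FDG-PET.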
Related papers
- Improving Breast Cancer Grade Prediction with Multiparametric MRI Created Using Optimized Synthetic Correlated Diffusion Imaging [71.91773485443125]
Grading plays a vital role in breast cancer treatment planning.
The current tumor grading method involves extracting tissue from patients, leading to stress, discomfort, and high medical costs.
This paper examines using optimized CDI$^s$ to improve breast cancer grade prediction.
arXiv Detail & Related papers (2024-05-13T15:48:26Z)
- AutoPET Challenge 2023: Sliding Window-based Optimization of U-Net [30.142259166452693]
In FDG-PET scans, irregular glucose consumption in healthy tissues may be misinterpreted as cancer.
The AutoPET challenge addresses this by providing a dataset of 1014 FDG-PET/CT studies.
arXiv Detail & Related papers (2023-09-21T14:34:17Z)
- A Localization-to-Segmentation Framework for Automatic Tumor
Segmentation in Whole-Body PET/CT Images [8.0523823243864]
This paper proposes a localization-to-segmentation framework (L2SNet) for precise tumor segmentation.
L2SNet first localizes the possible lesions in the lesion localization phase and then uses the location cues to shape the segmentation results in the lesion segmentation phase.
Experiments on the MICCAI Automated Lesion Segmentation in Whole-Body FDG-PET/CT challenge dataset show that our method achieves a competitive result.
arXiv Detail & Related papers (2023-09-11T13:39:15Z)
- CancerUniT: Towards a Single Unified Model for Effective Detection,
Segmentation, and Diagnosis of Eight Major Cancers Using a Large Collection
of CT Scans [45.83431075462771]
Human readers or radiologists routinely perform full-body multi-organ multi-disease detection and diagnosis in clinical practice.
Most medical AI systems are built to focus on single organs with a narrow list of a few diseases.
CancerUniT is a query-based Mask Transformer model that outputs multi-tumor predictions.
arXiv Detail & Related papers (2023-01-28T20:09:34Z)
- Whole-body tumor segmentation of 18F-FDG PET/CT using a cascaded and
ensembled convolutional neural networks [2.735686397209314]
The goal of this study was to report the performance of a deep neural network designed to automatically segment regions suspected of cancer in whole-body 18F-FDG PET/CT images.
A cascaded approach was developed in which a stacked ensemble of 3D U-Net CNNs processed the PET/CT images at a fixed 6 mm resolution.
arXiv Detail & Related papers (2022-10-14T19:25:56Z)
- A Pathologist-Informed Workflow for Classification of Prostate Glands in
Histopathology [62.997667081978825]
Pathologists diagnose and grade prostate cancer by examining tissue from needle biopsies on glass slides.
Cancer's severity and risk of metastasis are determined by the Gleason grade, a score based on the organization and morphology of prostate cancer glands.
This paper proposes an automated workflow that follows pathologists' modus operandi, isolating and classifying multi-scale patches of individual glands.
arXiv Detail & Related papers (2022-09-27T14:08:19Z)
- Automatic Tumor Segmentation via False Positive Reduction Network for
Whole-Body Multi-Modal PET/CT Images [12.885308856495353]
In PET/CT image assessment, automatic tumor segmentation is an important step.
Existing methods tend to over-segment tumor regions and include areas such as normal high-uptake organs, inflammation, and infection.
We introduce a false positive reduction network to overcome this limitation.
arXiv Detail & Related papers (2022-09-16T04:01:14Z)
- Federated Learning Enables Big Data for Rare Cancer Boundary Detection [98.5549882883963]
We present findings from the largest Federated ML study to-date, involving data from 71 healthcare institutions across 6 continents.
We generate an automatic tumor boundary detector for the rare disease of glioblastoma.
We demonstrate a 33% improvement over a publicly trained model to delineate the surgically targetable tumor, and 23% improvement over the tumor's entire extent.
arXiv Detail & Related papers (2022-04-22T17:27:00Z)
- EMT-NET: Efficient multitask network for computer-aided diagnosis of
breast cancer [58.720142291102135]
We propose an efficient and light-weighted learning architecture to classify and segment breast tumors simultaneously.
We incorporate a segmentation task into a tumor classification network, which makes the backbone network learn representations focused on tumor regions.
The accuracy, sensitivity, and specificity of tumor classification are 88.6%, 94.1%, and 85.3%, respectively.
arXiv Detail & Related papers (2022-01-13T05:24:40Z)
- Multimodal Spatial Attention Module for Targeting Multimodal PET-CT Lung
Tumor Segmentation [11.622615048002567]
Multimodal spatial attention module (MSAM) learns to emphasize regions related to tumors.
MSAM can be applied to common backbone architectures and trained end-to-end.
arXiv Detail & Related papers (2020-07-29T10:27:22Z)
- Segmentation for Classification of Screening Pancreatic Neuroendocrine
Tumors [72.65802386845002]
This work presents comprehensive results on early-stage detection of pancreatic neuroendocrine tumors (PNETs) in abdominal CT scans.
To the best of our knowledge, this task has not been studied before as a computational task.
Our approach outperforms state-of-the-art segmentation networks and achieves a sensitivity of 89.47% at a specificity of 81.08%.
arXiv Detail & Related papers (2020-04-04T21:21:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.