Leveraging Transformers to Improve Breast Cancer Classification and Risk
Assessment with Multi-modal and Longitudinal Data
- URL: http://arxiv.org/abs/2311.03217v2
- Date: Wed, 15 Nov 2023 14:37:24 GMT
- Title: Leveraging Transformers to Improve Breast Cancer Classification and Risk
Assessment with Multi-modal and Longitudinal Data
- Authors: Yiqiu Shen, Jungkyu Park, Frank Yeung, Eliana Goldberg, Laura Heacock,
Farah Shamout, Krzysztof J. Geras
- Abstract summary: Multi-modal Transformer (MMT) is a neural network that utilizes mammography and ultrasound synergistically.
MMT tracks temporal tissue changes by comparing current exams to prior imaging.
For 5-year risk prediction, MMT attains an AUROC of 0.826, outperforming prior mammography-based risk models.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Breast cancer screening, primarily conducted through mammography, is often
supplemented with ultrasound for women with dense breast tissue. However,
existing deep learning models analyze each modality independently, missing
opportunities to integrate information across imaging modalities and time. In
this study, we present Multi-modal Transformer (MMT), a neural network that
utilizes mammography and ultrasound synergistically, to identify patients who
currently have cancer and estimate the risk of future cancer for patients who
are currently cancer-free. MMT aggregates multi-modal data through
self-attention and tracks temporal tissue changes by comparing current exams to
prior imaging. Trained on 1.3 million exams, MMT achieves an AUROC of 0.943 in
detecting existing cancers, surpassing strong uni-modal baselines. For 5-year
risk prediction, MMT attains an AUROC of 0.826, outperforming prior
mammography-based risk models. Our research highlights the value of multi-modal
and longitudinal imaging in cancer diagnosis and risk stratification.
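The abstract describes aggregating per-exam embeddings across modalities and time points via self-attention. The sketch below illustrates that idea only; it is not the authors' MMT implementation, and the dimensions, random projections, and token layout are all hypothetical.

```python
import numpy as np

def self_attention(tokens, d_k):
    """Single-head scaled dot-product self-attention over a token sequence.

    tokens: (n, d) array; here each row stands in for one image embedding
    (e.g. a mammography view, an ultrasound image, or a prior exam).
    """
    rng = np.random.default_rng(0)
    # Illustrative projection matrices (random here; learned in practice).
    Wq = rng.standard_normal((tokens.shape[1], d_k))
    Wk = rng.standard_normal((tokens.shape[1], d_k))
    Wv = rng.standard_normal((tokens.shape[1], d_k))
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax: each token attends over all exams.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V

# Hypothetical embeddings: 2 mammography views, 1 ultrasound, 1 prior exam.
tokens = np.random.default_rng(1).standard_normal((4, 8))
fused = self_attention(tokens, d_k=8)
print(fused.shape)  # (4, 8)
```

Because every token attends to every other, each fused embedding can mix information across modalities and across current and prior exams, which is the mechanism the abstract credits for tracking temporal tissue changes.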
Related papers
- Enhancing Trust in Clinically Significant Prostate Cancer Prediction with Multiple Magnetic Resonance Imaging Modalities [61.36288157482697]
In the United States, prostate cancer is the second leading cause of cancer death in males, with a predicted 35,250 deaths in 2024.
In this paper, we investigate combining multiple MRI modalities to train a deep learning model to enhance trust in the models for clinically significant prostate cancer prediction.
arXiv Detail & Related papers (2024-11-07T12:48:27Z)
- Towards Non-invasive and Personalized Management of Breast Cancer Patients from Multiparametric MRI via A Large Mixture-of-Modality-Experts Model [19.252851972152957]
We report a mixture-of-modality-experts model (MOME) that integrates multiparametric MRI information within a unified structure.
MOME demonstrated accurate and robust identification of breast cancer.
It could reduce the need for biopsies in BI-RADS 4 patients by 7.3%, classify triple-negative breast cancer with an AUROC of 0.709, and predict pathological complete response to neoadjuvant chemotherapy with an AUROC of 0.694.
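MOME combines per-modality experts under a unified model. As a rough illustration of the mixture-of-experts idea (not MOME's actual architecture), the sketch below uses a softmax gate to weight the outputs of several linear "experts"; all names and shapes are invented for the example.

```python
import numpy as np

def moe_forward(x, experts, gate_w):
    """Soft mixture-of-experts: a gate weights per-expert outputs.

    x: (d,) input feature vector (e.g. one MRI-sequence embedding)
    experts: list of (d, d_out) weight matrices, one per modality expert
    gate_w: (d, n_experts) gating weights
    """
    logits = x @ gate_w
    g = np.exp(logits - logits.max())
    g /= g.sum()                               # softmax gate over experts
    outs = np.stack([x @ W for W in experts])  # (n_experts, d_out)
    return g @ outs                            # gate-weighted combination

rng = np.random.default_rng(0)
x = rng.standard_normal(6)
experts = [rng.standard_normal((6, 4)) for _ in range(3)]
gate_w = rng.standard_normal((6, 3))
y = moe_forward(x, experts, gate_w)
print(y.shape)  # (4,)
```

The gate lets the model lean on whichever modality expert is most informative for a given input, which is one common motivation for mixture-of-experts designs in multiparametric imaging.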
arXiv Detail & Related papers (2024-08-08T05:04:13Z)
- Improving Breast Cancer Grade Prediction with Multiparametric MRI Created Using Optimized Synthetic Correlated Diffusion Imaging [71.91773485443125]
Grading plays a vital role in breast cancer treatment planning.
The current tumor grading method involves extracting tissue from patients, leading to stress, discomfort, and high medical costs.
This paper examines using optimized CDI$s$ to improve breast cancer grade prediction.
arXiv Detail & Related papers (2024-05-13T15:48:26Z)
- Improved Prognostic Prediction of Pancreatic Cancer Using Multi-Phase CT by Integrating Neural Distance and Texture-Aware Transformer [37.55853672333369]
This paper proposes a novel learnable neural distance that describes the precise relationship between the tumor and vessels in CT images of different patients.
The developed risk marker was the strongest predictor of overall survival among preoperative factors.
arXiv Detail & Related papers (2023-08-01T12:46:02Z)
- Cancer-Net BCa-S: Breast Cancer Grade Prediction using Volumetric Deep Radiomic Features from Synthetic Correlated Diffusion Imaging [82.74877848011798]
The prevalence of breast cancer continues to grow, affecting about 300,000 females in the United States in 2023.
The gold-standard Scarff-Bloom-Richardson (SBR) grade has been shown to consistently indicate a patient's response to chemotherapy.
In this paper, we study the efficacy of deep learning for breast cancer grading based on synthetic correlated diffusion (CDI$s$) imaging.
arXiv Detail & Related papers (2023-04-12T15:08:34Z)
- A Multi-Institutional Open-Source Benchmark Dataset for Breast Cancer Clinical Decision Support using Synthetic Correlated Diffusion Imaging Data [82.74877848011798]
Cancer-Net BCa is a multi-institutional open-source benchmark dataset of volumetric CDI$s$ imaging data of breast cancer patients.
Cancer-Net BCa is publicly available as a part of a global open-source initiative dedicated to accelerating advancement in machine learning to aid clinicians in the fight against cancer.
arXiv Detail & Related papers (2023-04-12T05:41:44Z)
- RADIFUSION: A multi-radiomics deep learning based breast cancer risk prediction model using sequential mammographic images with image attention and bilateral asymmetry refinement [0.36355629235144304]
Our study highlights the importance of various deep learning mechanisms, such as image attention radiomic features, gating mechanism, and bilateral asymmetry-based fine-tuning.
Our findings suggest that RADIFUSION can provide clinicians with a powerful tool for breast cancer risk assessment.
arXiv Detail & Related papers (2023-04-01T08:18:13Z)
- EMT-NET: Efficient multitask network for computer-aided diagnosis of breast cancer [58.720142291102135]
We propose an efficient and light-weighted learning architecture to classify and segment breast tumors simultaneously.
We incorporate a segmentation task into a tumor classification network, which makes the backbone network learn representations focused on tumor regions.
The accuracy, sensitivity, and specificity of tumor classification are 88.6%, 94.1%, and 85.3%, respectively.
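The reported accuracy, sensitivity, and specificity all derive from the binary confusion matrix. A minimal sketch of how these three metrics are computed (with toy labels, not the paper's data):

```python
import numpy as np

def classification_metrics(y_true, y_pred):
    """Accuracy, sensitivity (true-positive rate), and specificity
    (true-negative rate) for binary labels."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return accuracy, sensitivity, specificity

# Toy example: 3 positives, 2 negatives.
acc, sens, spec = classification_metrics([1, 1, 1, 0, 0], [1, 1, 0, 0, 1])
print(round(acc, 2), round(sens, 2), round(spec, 2))  # 0.6 0.67 0.5
```

Sensitivity and specificity are reported alongside accuracy because, in screening settings with imbalanced classes, accuracy alone can hide a poor detection rate on the rarer positive class.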
arXiv Detail & Related papers (2022-01-13T05:24:40Z)
- Learned super resolution ultrasound for improved breast lesion characterization [52.77024349608834]
Super resolution ultrasound localization microscopy enables imaging of the microvasculature at the capillary level.
In this work we use a deep neural network architecture that makes effective use of signal structure to address these challenges.
By leveraging our trained network, the microvasculature structure is recovered in a short time, without prior PSF knowledge, and without requiring separability of the UCAs.
arXiv Detail & Related papers (2021-07-12T09:04:20Z)
- Deep-CR MTLR: a Multi-Modal Approach for Cancer Survival Prediction with Competing Risks [0.4189643331553922]
We present Deep-CR MTLR -- a novel machine learning approach for accurate cancer survival prediction.
We demonstrate improved prognostic performance of the multi-modal approach over single modality predictors in a cohort of 2552 head and neck cancer patients.
arXiv Detail & Related papers (2020-12-10T15:51:47Z)
- Synthesizing lesions using contextual GANs improves breast cancer classification on mammograms [0.4297070083645048]
We present a novel generative adversarial network (GAN) model for data augmentation that can realistically synthesize and remove lesions on mammograms.
With self-attention and semi-supervised learning components, the U-net-based architecture can generate high-resolution (256x256 px) outputs.
arXiv Detail & Related papers (2020-05-29T21:23:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.