Histomorphology-driven multi-instance learning for breast cancer WSI classification
- URL: http://arxiv.org/abs/2503.17983v1
- Date: Sun, 23 Mar 2025 08:25:29 GMT
- Title: Histomorphology-driven multi-instance learning for breast cancer WSI classification
- Authors: Baizhi Wang, Rui Yan, Wenxin Ma, Xu Zhang, Yuhao Wang, Xiaolong Li, Yunjie Gu, Zihang Jiang, S. Kevin Zhou
- Abstract summary: Current whole slide image (WSI) classification methods struggle to effectively incorporate histomorphology information. We propose a novel framework that explicitly incorporates histomorphology into WSI classification.
- Score: 37.70113409383555
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Histomorphology is crucial in breast cancer diagnosis. However, existing whole slide image (WSI) classification methods struggle to effectively incorporate histomorphology information, limiting their ability to capture key and fine-grained pathological features. To address this limitation, we propose a novel framework that explicitly incorporates histomorphology (tumor cellularity, cellular morphology, and tissue architecture) into WSI classification. Specifically, our approach consists of three key components: (1) estimating the importance of tumor-related histomorphology information at the patch level based on medical prior knowledge; (2) generating representative cluster-level features through histomorphology-driven cluster pooling; and (3) enabling WSI-level classification through histomorphology-driven multi-instance aggregation. With the incorporation of histomorphological information, our framework strengthens the model's ability to capture key and fine-grained pathological patterns, thereby enhancing WSI classification performance. Experimental results demonstrate its effectiveness, achieving high diagnostic accuracy for molecular subtyping and cancer subtyping. The code will be made available at https://github.com/Badgewho/HMDMIL.
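The released code at https://github.com/Badgewho/HMDMIL is the authoritative reference. As a rough illustration of the three-step pipeline described in the abstract (patch-level importance estimation, histomorphology-driven cluster pooling, and WSI-level multi-instance aggregation), the minimal PyTorch sketch below strings together importance-weighted k-means pooling and gated-attention aggregation. All module names, dimensions, and the specific clustering and attention choices are assumptions for illustration only, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation) of a histomorphology-driven
# MIL pipeline: (1) patch-level importance scores, (2) importance-weighted
# cluster pooling, (3) attention-based slide-level aggregation.
import torch
import torch.nn as nn


def cluster_pool(patch_feats, importance, num_clusters=8, iters=10):
    """Importance-weighted k-means pooling: returns one feature per cluster."""
    # Initialise centroids from the most "important" patches.
    idx = torch.argsort(importance, descending=True)[:num_clusters]
    centroids = patch_feats[idx].clone()
    for _ in range(iters):
        assign = torch.cdist(patch_feats, centroids).argmin(dim=1)   # (n,)
        for k in range(num_clusters):
            mask = assign == k
            if mask.any():
                w = importance[mask].unsqueeze(1)                     # patch weights
                centroids[k] = (w * patch_feats[mask]).sum(0) / w.sum()
    return centroids                                                  # (K, d)


class AttentionMIL(nn.Module):
    """Attention aggregation over cluster-level features -> WSI-level logits."""
    def __init__(self, dim=512, num_classes=4):
        super().__init__()
        self.attn = nn.Sequential(nn.Linear(dim, 128), nn.Tanh(), nn.Linear(128, 1))
        self.head = nn.Linear(dim, num_classes)

    def forward(self, cluster_feats):                                 # (K, d)
        a = torch.softmax(self.attn(cluster_feats), dim=0)            # (K, 1)
        slide_feat = (a * cluster_feats).sum(dim=0)                   # (d,)
        return self.head(slide_feat)


# Toy usage: 200 patches with 512-d features and precomputed importance scores
# (stand-ins for the prior-knowledge scores described in the paper).
feats = torch.randn(200, 512)
importance = torch.rand(200)
logits = AttentionMIL()(cluster_pool(feats, importance))
print(logits.shape)  # torch.Size([4])
```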
Related papers
- Pathological Prior-Guided Multiple Instance Learning For Mitigating Catastrophic Forgetting in Breast Cancer Whole Slide Image Classification [53.45227142589896]
We propose a new framework PaGMIL to mitigate catastrophic forgetting in breast cancer WSI classification. Our framework introduces two key components into the common MIL model architecture. We evaluate the continual learning performance of PaGMIL across several public breast cancer datasets.
arXiv Detail & Related papers (2025-03-08T04:51:58Z) - MIRROR: Multi-Modal Pathological Self-Supervised Representation Learning via Modality Alignment and Retention [52.106879463828044]
Histopathology and transcriptomics are fundamental modalities in oncology, encapsulating the morphological and molecular aspects of the disease. We present MIRROR, a novel multi-modal representation learning method designed to foster both modality alignment and retention. Extensive evaluations on TCGA cohorts for cancer subtyping and survival analysis highlight MIRROR's superior performance.
arXiv Detail & Related papers (2025-03-01T07:02:30Z) - ActiveSSF: An Active-Learning-Guided Self-Supervised Framework for Long-Tailed Megakaryocyte Classification [3.6535793744942318]
We propose the ActiveSSF framework, which integrates active learning with self-supervised pretraining. Experimental results on clinical megakaryocyte datasets demonstrate that ActiveSSF achieves state-of-the-art performance. To foster further research, the code and datasets will be publicly released in the future.
arXiv Detail & Related papers (2025-02-12T08:24:36Z) - FECT: Classification of Breast Cancer Pathological Images Based on Fusion Features [1.9356426053533178]
We propose a novel breast cancer tissue classification model that fuses features of Edges, Cells, and Tissues (FECT). Our model surpasses current advanced methods in classification accuracy and F1 scores. It exhibits interpretability and holds promise for significant roles in future clinical applications.
arXiv Detail & Related papers (2025-01-17T11:32:33Z) - HATs: Hierarchical Adaptive Taxonomy Segmentation for Panoramic Pathology Image Analysis [19.04633470168871]
Panoramic image segmentation in computational pathology presents a remarkable challenge due to the morphologically complex and variably scaled anatomy.
In this paper, we propose a novel Hierarchical Adaptive Taxonomy (HATs) method, which is designed to thoroughly segment panoramic views of kidney structures by leveraging detailed anatomical insights.
Our approach entails (1) the innovative HATs technique which translates spatial relationships among 15 distinct object classes into a versatile "plug-and-play" loss function that spans across regions, functional units, and cells, (2) the incorporation of anatomical hierarchies and scale considerations into a unified simple matrix representation for all panoramic entities, and (3) …
arXiv Detail & Related papers (2024-06-30T05:35:26Z) - Self-Supervised Graph Representation Learning for Neuronal Morphologies [75.38832711445421]
We present GraphDINO, a data-driven approach to learn low-dimensional representations of 3D neuronal morphologies from unlabeled datasets.
We show, in two different species and across multiple brain areas, that this method yields morphological cell type clusterings on par with manual feature-based classification by experts.
Our method could potentially enable data-driven discovery of novel morphological features and cell types in large-scale datasets.
arXiv Detail & Related papers (2021-12-23T12:17:47Z) - MCUa: Multi-level Context and Uncertainty aware Dynamic Deep Ensemble for Breast Cancer Histology Image Classification [18.833782238355386]
We propose a novel CNN called Multi-level Context and Uncertainty aware (MCUa) dynamic deep learning ensemble model.
The MCUa model achieves a high accuracy of 98.11% on a breast cancer histology image dataset.
arXiv Detail & Related papers (2021-08-24T13:18:57Z) - G-MIND: An End-to-End Multimodal Imaging-Genetics Framework for Biomarker Identification and Disease Classification [49.53651166356737]
We propose a novel deep neural network architecture to integrate imaging and genetics data, as guided by diagnosis, that provides interpretable biomarkers.
We have evaluated our model on a population study of schizophrenia that includes two functional MRI (fMRI) paradigms and Single Nucleotide Polymorphism (SNP) data.
arXiv Detail & Related papers (2021-01-27T19:28:04Z) - Attention Model Enhanced Network for Classification of Breast Cancer Image [54.83246945407568]
AMEN is formulated in a multi-branch fashion with a pixel-wise attention model and a classification submodule.
To focus more on subtle detail information, the sample image is enhanced by the pixel-wise attention map generated from the former branch.
Experiments conducted on three benchmark datasets demonstrate the superiority of the proposed method under various scenarios.
arXiv Detail & Related papers (2020-10-07T08:44:21Z) - A Biologically Interpretable Two-stage Deep Neural Network (BIT-DNN) For Vegetation Recognition From Hyperspectral Imagery [3.708283803668841]
This study proposes a novel interpretable deep learning model: a biologically interpretable two-stage deep neural network (BIT-DNN).
The proposed model has been compared with five state-of-the-art deep learning models.
arXiv Detail & Related papers (2020-04-19T15:58:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.