H&E-adversarial network: a convolutional neural network to learn
stain-invariant features through Hematoxylin & Eosin regression
- URL: http://arxiv.org/abs/2201.06329v2
- Date: Wed, 19 Jan 2022 16:29:37 GMT
- Title: H&E-adversarial network: a convolutional neural network to learn
stain-invariant features through Hematoxylin & Eosin regression
- Authors: Niccolò Marini, Manfredo Atzori, Sebastian Otálora, Stephane
Marchand-Maillet, Henning Müller
- Abstract summary: This paper presents a novel method to train convolutional neural networks (CNNs) that better generalize on data including several colour variations.
The method, called H&E-adversarial CNN, exploits H&E matrix information to learn stain-invariant features during the training.
- Score: 1.7371375427784381
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Computational pathology is a domain that aims to develop algorithms to
automatically analyze large digitized histopathology images, called whole slide
images (WSI). WSIs are produced by scanning thin tissue samples that are stained
to make specific structures visible. They show stain colour heterogeneity due to
different preparation and scanning settings applied across medical centers. Stain
colour heterogeneity is a problem for training convolutional neural networks
(CNNs), the state-of-the-art algorithms for most computational pathology tasks,
since CNNs usually underperform when tested on images whose stain variations
differ from those in the training data. Despite the several methods that have
been developed, stain colour heterogeneity remains an unsolved challenge that
limits the development of CNNs that can generalize to data from several medical
centers. This paper presents a novel method to train CNNs that generalize better
on data containing several colour variations. The method, called H&E-adversarial
CNN, exploits H&E matrix information to learn stain-invariant features during
training. The method is evaluated on the classification of colon and prostate
histopathology images, involving eleven heterogeneous datasets, and compared with
five other techniques used to handle stain colour heterogeneity. H&E-adversarial
CNNs show improved performance compared to the other algorithms, demonstrating
that the method can help to better deal with stain colour heterogeneity.
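A minimal sketch of how such a setup could be implemented in PyTorch is shown below. It is an assumption based on the abstract, not the authors' exact architecture: a shared backbone feeds a tissue-classification head and an auxiliary head that regresses the per-image H&E stain matrix (for example, estimated beforehand with a Macenko-style stain deconvolution), and a gradient-reversal layer on the regression branch is one common way to push the shared features toward stain invariance. The loss weighting shown is illustrative.

```python
# Hedged sketch (assumption, not the authors' exact implementation):
# an "H&E-adversarial"-style CNN. A shared backbone feeds
# (1) a tissue-classification head and (2) an auxiliary head regressing
# the per-image H&E stain matrix (a 2x3 matrix flattened to 6 values,
# assumed to be precomputed, e.g. via Macenko-style stain deconvolution).
# A gradient-reversal layer makes the regression branch adversarial,
# pushing the backbone toward stain-invariant features.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass, negated (scaled) gradient backward."""

    @staticmethod
    def forward(ctx, x, lamb):
        ctx.lamb = lamb
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lamb * grad_output, None


class HEAdversarialCNN(nn.Module):
    def __init__(self, n_classes: int, lamb: float = 1.0):
        super().__init__()
        backbone = resnet18()                 # no pretrained weights by default
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()           # keep the pooled features
        self.backbone = backbone
        self.classifier = nn.Linear(feat_dim, n_classes)
        self.he_regressor = nn.Linear(feat_dim, 6)   # flattened H&E matrix
        self.lamb = lamb

    def forward(self, x):
        feats = self.backbone(x)
        logits = self.classifier(feats)
        he_pred = self.he_regressor(GradReverse.apply(feats, self.lamb))
        return logits, he_pred


def training_loss(logits, labels, he_pred, he_target, alpha=1.0):
    # Classification loss plus (adversarial) H&E-matrix regression loss;
    # the weighting alpha is illustrative, not taken from the paper.
    return F.cross_entropy(logits, labels) + alpha * F.mse_loss(he_pred, he_target)
```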
Related papers
- Color Equivariant Convolutional Networks [50.655443383582124]
CNNs struggle if there is data imbalance between color variations introduced by accidental recording conditions.
We propose Color Equivariant Convolutions (CEConvs), a novel deep learning building block that enables shape feature sharing across the color spectrum.
We demonstrate the benefits of CEConvs in terms of downstream performance on various tasks and improved robustness to color changes, including train-test distribution shifts.
arXiv Detail & Related papers (2023-10-30T09:18:49Z)
- On the ability of CNNs to extract color invariant intensity based features for image classification [4.297070083645049]
Convolutional neural networks (CNNs) have demonstrated remarkable success in vision-related tasks.
Recent studies suggest that CNNs exhibit a bias toward texture instead of object shape in image classification tasks.
This paper investigates the ability of CNNs to adapt to different color distributions in an image while maintaining context and background.
arXiv Detail & Related papers (2023-07-13T00:36:55Z)
- Convolutional Neural Network-Based Automatic Classification of Colorectal and Prostate Tumor Biopsies Using Multispectral Imagery: System Development Study [7.566742780233967]
We propose a CNN model for classifying colorectal and prostate tumors from multispectral images of biopsy samples.
Our results showed excellent performance, with an average test accuracy of 99.8% and 99.5% for the prostate and colorectal data sets, respectively.
The proposed CNN architecture was globally the best-performing system for classifying colorectal and prostate tumor images.
arXiv Detail & Related papers (2023-01-30T18:28:25Z)
- Stain-invariant self supervised learning for histopathology image analysis [74.98663573628743]
We present a self-supervised algorithm for several classification tasks within hematoxylin and eosin stained images of breast cancer.
Our method achieves the state-of-the-art performance on several publicly available breast cancer datasets.
arXiv Detail & Related papers (2022-11-14T18:16:36Z)
- Data-Efficient Vision Transformers for Multi-Label Disease Classification on Chest Radiographs [55.78588835407174]
Vision Transformers (ViTs) have not been applied to this task despite their high classification performance on generic images.
ViTs do not rely on convolutions but on patch-based self-attention and in contrast to CNNs, no prior knowledge of local connectivity is present.
Our results show that while the performance between ViTs and CNNs is on par with a small benefit for ViTs, DeiTs outperform the former if a reasonably large data set is available for training.
arXiv Detail & Related papers (2022-08-17T09:07:45Z)
- Two-Stream Graph Convolutional Network for Intra-oral Scanner Image Segmentation [133.02190910009384]
We propose a two-stream graph convolutional network (i.e., TSGCN) to handle inter-view confusion between different raw attributes.
Our TSGCN significantly outperforms state-of-the-art methods in 3D tooth (surface) segmentation.
arXiv Detail & Related papers (2022-04-19T10:41:09Z)
- Stain Normalized Breast Histopathology Image Recognition using Convolutional Neural Networks for Cancer Detection [9.826027427965354]
Recent advances have shown that convolutional neural network (CNN) architectures can be used to design a computer-aided diagnostic (CAD) system for breast cancer detection.
We consider some contemporary CNN models for binary classification of breast histopathology images.
We have validated the trained CNNs on the publicly available BreaKHis dataset, for 200x and 400x magnified histopathology images.
arXiv Detail & Related papers (2022-01-04T03:09:40Z)
- Generative Adversarial U-Net for Domain-free Medical Image Augmentation [49.72048151146307]
The shortage of annotated medical images is one of the biggest challenges in the field of medical image computing.
In this paper, we develop a novel generative method named generative adversarial U-Net.
Our newly designed model is domain-free and generalizable to various medical images.
arXiv Detail & Related papers (2021-01-12T23:02:26Z)
- Assessing The Importance Of Colours For CNNs In Object Recognition [70.70151719764021]
Convolutional neural networks (CNNs) have been shown to exhibit conflicting properties.
We demonstrate that CNNs often rely heavily on colour information while making a prediction.
We evaluate a model trained with congruent images on congruent, greyscale, and incongruent images.
arXiv Detail & Related papers (2020-12-12T22:55:06Z)
- Learning Interpretable Microscopic Features of Tumor by Multi-task Adversarial CNNs To Improve Generalization [1.7371375427784381]
Existing CNN models act as black boxes, giving physicians no assurance that the model bases its decisions on important diagnostic features.
Here we show that our architecture, by learning end-to-end an uncertainty-based weighting combination of multi-task and adversarial losses, is encouraged to focus on pathology features.
Our results on breast lymph node tissue show significantly improved generalization in the detection of tumorous tissue, with a best average AUC of 0.89 (0.01) against a baseline AUC of 0.86 (0.005).
arXiv Detail & Related papers (2020-08-04T12:10:35Z)
- Stain Style Transfer of Histopathology Images Via Structure-Preserved Generative Learning [31.254432319814864]
This study proposes two stain style transfer models, SSIM-GAN and DSCSI-GAN, based on generative adversarial networks.
By incorporating structural preservation metrics and feedback from an auxiliary diagnosis net into learning, medically relevant information is preserved in the color-normalized images; a rough sketch of this structure-preservation idea is given below.
arXiv Detail & Related papers (2020-07-24T15:30:19Z)
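As a rough illustration of the structure-preservation idea mentioned in the last entry, the sketch below combines a non-saturating adversarial term with a simplified (global, non-windowed) SSIM term between the source image and the color-normalized output. The function and variable names, the simplified SSIM, and the weighting are assumptions made for illustration, not the SSIM-GAN or DSCSI-GAN implementations.

```python
# Hedged sketch (assumption): a structure-preserving generator loss for a
# stain style transfer GAN, combining an adversarial term with an SSIM-based
# structural similarity term between input and color-normalized output.
import torch
import torch.nn.functional as F


def simple_ssim(x, y, c1=0.01 ** 2, c2=0.03 ** 2):
    # Global (non-windowed) SSIM per image; a simplification of the usual
    # sliding-window SSIM used by structure-preservation losses.
    mu_x = x.mean(dim=(1, 2, 3))
    mu_y = y.mean(dim=(1, 2, 3))
    var_x = x.var(dim=(1, 2, 3), unbiased=False)
    var_y = y.var(dim=(1, 2, 3), unbiased=False)
    cov = ((x - mu_x[:, None, None, None]) *
           (y - mu_y[:, None, None, None])).mean(dim=(1, 2, 3))
    return ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )


def generator_loss(d_fake_logits, source_img, normalized_img, lambda_ssim=1.0):
    # Adversarial term: make the discriminator classify the generated
    # (color-normalized) image as real.
    adv = F.binary_cross_entropy_with_logits(
        d_fake_logits, torch.ones_like(d_fake_logits)
    )
    # Structure-preservation term: keep the normalized image structurally
    # close to the source tissue image (images assumed scaled to [0, 1]).
    struct = 1.0 - simple_ssim(source_img, normalized_img).mean()
    return adv + lambda_ssim * struct
```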