RestainNet: a self-supervised digital re-stainer for stain normalization
- URL: http://arxiv.org/abs/2202.13804v1
- Date: Mon, 28 Feb 2022 14:05:42 GMT
- Title: RestainNet: a self-supervised digital re-stainer for stain normalization
- Authors: Bingchao Zhao, Jiatai Lin, Changhong Liang, Zongjian Yi, Xin Chen,
Bingbing Li, Weihao Qiu, Danyi Li, Li Liang, Chu Han, and Zaiyi Liu
- Abstract summary: We formulated stain normalization as a digital re-staining process and proposed a self-supervised learning model called RestainNet.
Our network is regarded as a digital re-stainer which learns how to re-stain an unstained (grayscale) image.
- Score: 8.740191087897987
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Color inconsistency is an inevitable challenge in computational pathology,
generally caused by variations in stain intensity or by sections being
scanned on different scanners. It harms pathological image analysis
methods, especially learning-based models. A series of approaches have been
proposed for stain normalization. However, most of them lack flexibility in
practice. In this paper, we formulated stain normalization as a digital
re-staining process and proposed a self-supervised learning model called
RestainNet. Our network is regarded as a digital re-stainer which learns how
to re-stain an unstained (grayscale) image. Two digital stains, Hematoxylin
(H) and Eosin (E), were extracted from the original image via the Beer-Lambert law.
We proposed a staining loss to maintain the correctness of stain intensity
during the re-staining process. Thanks to its self-supervised nature, paired
training samples are no longer necessary, which offers great flexibility in
practical usage. Our RestainNet outperforms existing approaches and achieves
state-of-the-art performance with regard to color correctness and structure
preservation. We further conducted experiments on segmentation and
classification tasks, where RestainNet again achieved outstanding
performance compared with state-of-the-art methods. The self-supervised design allows the
network to learn any staining style with no extra effort.
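The Beer-Lambert extraction the abstract refers to can be sketched in code. The following is a minimal illustration of H&E stain separation via optical density, using the standard Ruifrok-Johnston reference stain vectors as an assumption; the paper's exact stain matrix is not given here:

```python
import numpy as np

# Ruifrok-Johnston reference stain vectors for H&E (rows: H, E, residual).
# These published reference values are an assumption; RestainNet's exact
# extraction may differ.
STAIN_MATRIX = np.array([
    [0.65, 0.70, 0.29],   # Hematoxylin
    [0.07, 0.99, 0.11],   # Eosin
    [0.27, 0.57, 0.78],   # residual channel
])
STAIN_MATRIX /= np.linalg.norm(STAIN_MATRIX, axis=1, keepdims=True)

def separate_he(rgb: np.ndarray, io: float = 255.0, eps: float = 1e-6) -> np.ndarray:
    """Map an RGB image (H, W, 3) to per-pixel H/E/residual stain densities."""
    # Beer-Lambert law: optical density is the negative log of transmittance.
    od = -np.log10(np.maximum(rgb.astype(np.float64), eps) / io)
    # Deconvolve optical density into stain concentrations.
    densities = od.reshape(-1, 3) @ np.linalg.inv(STAIN_MATRIX)
    return densities.reshape(rgb.shape)

def recombine(densities: np.ndarray, io: float = 255.0) -> np.ndarray:
    """Invert the transform: stain densities back to an RGB image."""
    od = densities.reshape(-1, 3) @ STAIN_MATRIX
    rgb = io * np.power(10.0, -od)
    return np.rint(np.clip(rgb, 0, 255)).reshape(densities.shape).astype(np.uint8)
```

A round trip through `separate_he` and `recombine` reconstructs the input image up to rounding, which is what allows a re-stainer to supervise itself on unpaired data.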
Related papers
- IReNe: Instant Recoloring of Neural Radiance Fields [54.94866137102324]
We introduce IReNe, enabling swift, near real-time color editing in NeRF.
We leverage a pre-trained NeRF model and a single training image with user-applied color edits.
This adjustment allows the model to generate new scene views, accurately representing the color changes from the training image.
arXiv Detail & Related papers (2024-05-30T09:30:28Z)
- Single color digital H&E staining with In-and-Out Net [0.8271394038014485]
This paper introduces a novel network, In-and-Out Net, specifically designed for virtual staining tasks.
Based on Generative Adversarial Networks (GAN), our model efficiently transforms Reflectance Confocal Microscopy (RCM) images into Hematoxylin and Eosin stained images.
arXiv Detail & Related papers (2024-05-22T01:17:27Z)
- Iterative Token Evaluation and Refinement for Real-World Super-Resolution [77.74289677520508]
Real-world image super-resolution (RWSR) is a long-standing problem as low-quality (LQ) images often have complex and unidentified degradations.
We propose an Iterative Token Evaluation and Refinement framework for RWSR.
We show that ITER is easier to train than Generative Adversarial Networks (GANs) and more efficient than continuous diffusion models.
arXiv Detail & Related papers (2023-12-09T17:07:32Z)
- Stain Consistency Learning: Handling Stain Variation for Automatic Digital Pathology Segmentation [3.2386272343130127]
We propose a novel framework combining stain-specific augmentation with a stain consistency loss function to learn stain colour invariant features.
We compare ten methods on Masson's trichrome and H&E stained cell and nuclei datasets, respectively.
We observed that stain normalisation methods resulted in equivalent or worse performance, while stain augmentation or stain adversarial methods demonstrated improved performance.
arXiv Detail & Related papers (2023-11-11T12:00:44Z)
- Unsupervised Deep Digital Staining For Microscopic Cell Images Via Knowledge Distillation [46.006296303296544]
It is difficult to obtain large-scale stained/unstained cell image pairs in practice.
We propose a novel unsupervised deep learning framework for the digital staining of cell images.
We show that the proposed unsupervised deep staining method can generate stained images with more accurate positions and shapes of the cell targets.
arXiv Detail & Related papers (2023-03-03T16:26:38Z)
- Stain-invariant self supervised learning for histopathology image analysis [74.98663573628743]
We present a self-supervised algorithm for several classification tasks within hematoxylin and eosin stained images of breast cancer.
Our method achieves the state-of-the-art performance on several publicly available breast cancer datasets.
arXiv Detail & Related papers (2022-11-14T18:16:36Z)
- Seamless Iterative Semi-Supervised Correction of Imperfect Labels in Microscopy Images [57.42492501915773]
In-vitro tests are an alternative to animal testing for assessing the toxicity of medical devices.
Human fatigue contributes to labeling errors, which makes the use of deep learning appealing.
We propose Seamless Iterative Semi-Supervised correction of Imperfect labels (SISSI)
Our method successfully provides an adaptive early learning correction technique for object detection.
arXiv Detail & Related papers (2022-08-05T18:52:20Z)
- RandStainNA: Learning Stain-Agnostic Features from Histology Slides by Bridging Stain Augmentation and Normalization [45.81689497433507]
Two proposals, namely stain normalization (SN) and stain augmentation (SA), have been spotlighted to reduce the generalization error.
To address the problems, we unify SN and SA with a novel RandStainNA scheme.
The RandStainNA constrains variable stain styles in a practicable range to train a stain agnostic deep learning model.
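The idea of sampling stain styles from a practicable range can be sketched as a Reinhard-style transfer toward a randomly drawn target. This is a loose illustration, not RandStainNA's implementation; the per-channel statistics passed in are hypothetical values assumed to be fitted on a training set:

```python
import numpy as np

def random_reinhard_style(lab, mean_of_means, std_of_means,
                          mean_of_stds, std_of_stds, rng=None):
    """Shift a LAB-like float image (H, W, 3) toward a randomly sampled stain style.

    Each of the four statistics holds one value per channel. A target style
    (per-channel mean and std) is drawn from Gaussians fitted on the training
    set, then the image is normalized toward it, Reinhard-style.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Sample a target style within the fitted range.
    tgt_mean = rng.normal(mean_of_means, std_of_means)
    tgt_std = np.abs(rng.normal(mean_of_stds, std_of_stds))
    # Source statistics of this image.
    src_mean = lab.reshape(-1, 3).mean(axis=0)
    src_std = lab.reshape(-1, 3).std(axis=0) + 1e-6  # avoid division by zero
    # Standardize, then rescale to the sampled target style.
    return (lab - src_mean) / src_std * tgt_std + tgt_mean
```

Because the target is re-sampled per image, a model trained on such outputs sees a different plausible stain style at every step, which is the stain-agnostic training effect described above.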
arXiv Detail & Related papers (2022-06-25T16:43:59Z)
- Structure-Preserving Multi-Domain Stain Color Augmentation using Style-Transfer with Disentangled Representations [0.9051352746190446]
HistAuGAN can simulate a wide variety of realistic histology stain colors, thus making neural networks stain-invariant when applied during training.
Based on a generative adversarial network (GAN) for image-to-image translation, our model disentangles the content of the image, i.e., the morphological tissue structure, from the stain color attributes.
It can be trained on multiple domains and, therefore, learns to cover different stain colors as well as other domain-specific variations introduced in the slide preparation and imaging process.
arXiv Detail & Related papers (2021-07-26T17:52:39Z)
- Histopathological Stain Transfer using Style Transfer Network with Adversarial Loss [0.0]
We present a novel approach for the stain normalization problem using fast neural style transfer coupled with adversarial loss.
We also propose a novel stain transfer generator network based on High-Resolution Network (HRNet) which requires less training time and gives good generalization.
arXiv Detail & Related papers (2020-10-06T12:10:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site.