Virtual histological staining of unlabeled autopsy tissue
- URL: http://arxiv.org/abs/2308.00920v1
- Date: Wed, 2 Aug 2023 03:31:22 GMT
- Title: Virtual histological staining of unlabeled autopsy tissue
- Authors: Yuzhu Li, Nir Pillar, Jingxi Li, Tairan Liu, Di Wu, Songyu Sun,
Guangdong Ma, Kevin de Haan, Luzhe Huang, Sepehr Hamidi, Anatoly Urisman, Tal
Keidar Haran, William Dean Wallace, Jonathan E. Zuckerman, Aydogan Ozcan
- Abstract summary: We show that a trained neural network can transform autofluorescence images of label-free autopsy tissue sections into brightfield equivalent images that match hematoxylin and eosin stained versions of the same samples.
Our virtual autopsy staining technique can also be extended to necrotic tissue, and can rapidly and cost-effectively generate artifact-free H&E stains despite severe autolysis and cell death.
- Score: 1.9351365037275405
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Histological examination is a crucial step in an autopsy; however, the
traditional histochemical staining of post-mortem samples faces multiple
challenges, including the inferior staining quality due to autolysis caused by
delayed fixation of cadaver tissue, as well as the resource-intensive nature of
chemical staining procedures covering large tissue areas, which demand
substantial labor, cost, and time. These challenges can become more pronounced
during global health crises when the availability of histopathology services is
limited, resulting in further delays in tissue fixation and more severe
staining artifacts. Here, we report the first demonstration of virtual staining
of autopsy tissue and show that a trained neural network can rapidly transform
autofluorescence images of label-free autopsy tissue sections into brightfield
equivalent images that match hematoxylin and eosin (H&E) stained versions of
the same samples, eliminating autolysis-induced severe staining artifacts
inherent in traditional histochemical staining of autopsied tissue. Our virtual
H&E model was trained using >0.7 TB of image data and a data-efficient
collaboration scheme that integrates the virtual staining network with an image
registration network. The trained model effectively accentuated nuclear,
cytoplasmic and extracellular features in new autopsy tissue samples that
experienced severe autolysis, such as COVID-19 samples never seen before, where
the traditional histochemical staining failed to provide consistent staining
quality. This virtual autopsy staining technique can also be extended to
necrotic tissue, and can rapidly and cost-effectively generate artifact-free
H&E stains despite severe autolysis and cell death, also reducing labor, cost
and infrastructure requirements associated with the standard histochemical
staining.
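To make the image-to-image translation idea concrete, the sketch below shows one supervised training step for a toy virtual-staining generator that maps a label-free autofluorescence patch to an RGB H&E-like image. This is an illustrative assumption rather than the authors' released code: the `SmallGenerator` and `training_step` names, the layer sizes, the patch shapes, and the plain L1 loss are placeholders, and the paper's adversarial objective and collaborating image-registration network are omitted.

```python
# Minimal sketch of virtual staining as supervised image-to-image translation
# (autofluorescence -> brightfield H&E). Illustrative only; not the paper's
# architecture, GAN loss, or registration scheme.
import torch
import torch.nn as nn

class SmallGenerator(nn.Module):
    """Toy encoder-decoder standing in for the virtual-staining network."""
    def __init__(self, in_ch=1, out_ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, out_ch, 3, padding=1), nn.Sigmoid(),  # RGB in [0, 1]
        )

    def forward(self, x):
        return self.net(x)

def training_step(gen, opt, autofluor, registered_he):
    """One supervised step: predict an H&E-like RGB patch from a label-free
    autofluorescence patch and regress it against the co-registered
    histochemical H&E target (L1 loss only; the paper's adversarial and
    registration losses are left out)."""
    opt.zero_grad()
    pred = gen(autofluor)
    loss = nn.functional.l1_loss(pred, registered_he)
    loss.backward()
    opt.step()
    return loss.item()

if __name__ == "__main__":
    gen = SmallGenerator()
    opt = torch.optim.Adam(gen.parameters(), lr=1e-4)
    autofluor = torch.rand(4, 1, 256, 256)      # dummy label-free input patches
    registered_he = torch.rand(4, 3, 256, 256)  # dummy co-registered H&E targets
    print(training_step(gen, opt, autofluor, registered_he))
```

In practice, the inputs would be co-registered autofluorescence/H&E patch pairs; the paper's contribution includes learning that registration jointly with the staining network, which this sketch does not attempt.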
Related papers
- Generating Seamless Virtual Immunohistochemical Whole Slide Images with Content and Color Consistency [2.063403009505468]
Immunohistochemical (IHC) stains play a vital role in a pathologist's analysis of medical images, providing crucial diagnostic information for various diseases.
Virtual staining from hematoxylin and eosin (H&E)-stained whole slide images (WSIs) allows the automatic production of other useful IHC stains without the expensive physical staining process.
Current virtual WSI generation methods based on tile-wise processing often suffer from inconsistencies in content, texture, and color at tile boundaries.
We propose a novel consistent WSI synthesis network, CC-WSI-Net, that extends GAN models to …
arXiv Detail & Related papers (2024-10-01T21:02:16Z)
- Label-free evaluation of lung and heart transplant biopsies using virtual staining [3.24061990641619]
The traditional histochemical staining process is time-consuming, costly, and labor-intensive.
We present a panel of virtual staining neural networks for lung and heart transplant biopsies.
Virtual staining networks consistently produce high-quality histology images with high color uniformity.
arXiv Detail & Related papers (2024-09-09T00:18:48Z)
- Autonomous Quality and Hallucination Assessment for Virtual Tissue Staining and Digital Pathology [0.11728348229595655]
We present an autonomous quality and hallucination assessment method (termed AQuA) for virtual tissue staining.
AQuA achieves 99.8% accuracy when detecting acceptable and unacceptable virtually stained tissue images.
arXiv Detail & Related papers (2024-04-29T06:32:28Z)
- Automated segmentation of rheumatoid arthritis immunohistochemistry stained synovial tissue [0.0]
Rheumatoid Arthritis (RA) is a chronic, autoimmune disease which primarily affects the joint's synovial tissue.
It is a highly heterogeneous disease, with wide cellular and molecular variability observed in synovial tissues.
We train a UNET on a hand-curated, real-world multi-centre clinical dataset R4RA, which contains multiple types of IHC staining.
The model obtains a Dice score of 0.865 and successfully segments different types of IHC staining, while handling variation in colour, intensity and common WSI artefacts across the different clinical centres.
arXiv Detail & Related papers (2023-09-13T18:43:14Z)
- Digital staining in optical microscopy using deep learning -- a review [47.86254766044832]
Digital staining has emerged as a promising concept to use modern deep learning for the translation from optical contrast to established biochemical contrast of actual stainings.
We provide an in-depth analysis of the current state-of-the-art in this field, suggest methods of good practice, identify pitfalls and challenges and postulate promising advances towards potential future implementations and applications.
arXiv Detail & Related papers (2023-03-14T15:23:48Z)
- Virtual stain transfer in histology via cascaded deep neural networks [2.309018557701645]
We demonstrate a virtual stain transfer framework via a cascaded deep neural network (C-DNN).
Unlike a single neural network that takes one stain type as input and digitally outputs images of another stain type, C-DNN first uses virtual staining to transform autofluorescence microscopy images into H&E.
We successfully transferred the H&E-stained tissue images into virtual PAS (periodic acid-Schiff) stain (a minimal sketch of this cascade appears after this related-papers list).
arXiv Detail & Related papers (2022-07-14T00:43:18Z)
- Lymphocyte Classification in Hyperspectral Images of Ovarian Cancer Tissue Biopsy Samples [94.37521840642141]
We present a machine learning pipeline to segment white blood cell pixels in hyperspectral images of biopsy cores.
These cells are clinically important for diagnosis, but some prior work has struggled to incorporate them due to difficulty obtaining precise pixel labels.
arXiv Detail & Related papers (2022-03-23T00:58:27Z)
- Texture Characterization of Histopathologic Images Using Ecological Diversity Measures and Discrete Wavelet Transform [82.53597363161228]
This paper proposes a method for characterizing texture across histopathologic images with a considerable success rate.
It is possible to quantify the intrinsic properties of such images with promising accuracy on two HI datasets.
arXiv Detail & Related papers (2022-02-27T02:19:09Z)
- Data-driven generation of plausible tissue geometries for realistic photoacoustic image synthesis [53.65837038435433]
Photoacoustic tomography (PAT) has the potential to recover morphological and functional tissue properties.
We propose a novel approach to PAT data simulation, which we refer to as "learning to simulate".
We leverage the concept of Generative Adversarial Networks (GANs) trained on semantically annotated medical imaging data to generate plausible tissue geometries.
arXiv Detail & Related papers (2021-03-29T11:30:18Z)
- Deep learning-based transformation of the H&E stain into special stains [44.38127957263123]
We show the utility of supervised learning-based computational stain transformation from H&E to different special stains using tissue sections from kidney needle core biopsies.
Results: The quality of the special stains generated by the stain transformation network was statistically equivalent to that of stains generated through standard histochemical staining.
arXiv Detail & Related papers (2020-08-20T10:12:03Z)
- Modeling and Enhancing Low-quality Retinal Fundus Images [167.02325845822276]
Low-quality fundus images increase uncertainty in clinical observation and lead to the risk of misdiagnosis.
We propose a clinically oriented fundus enhancement network (cofe-Net) to suppress global degradation factors.
Experiments on both synthetic and real images demonstrate that our algorithm effectively corrects low-quality fundus images without losing retinal details.
arXiv Detail & Related papers (2020-05-12T08:01:16Z)
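For the cascaded stain-transfer entry above (C-DNN), the following minimal sketch illustrates the two-stage idea: a first network virtually stains a label-free autofluorescence patch to H&E, and a second network transforms that H&E output into PAS. The `ConvStainNet` module, its layer sizes, and the dummy tensors are illustrative assumptions, not the paper's architecture.

```python
# Hedged sketch of a two-stage (cascaded) stain-transfer pipeline:
# stage 1 maps autofluorescence -> virtual H&E, stage 2 maps that H&E
# output -> virtual PAS. Toy layers only; not the C-DNN paper's networks.
import torch
import torch.nn as nn

class ConvStainNet(nn.Module):
    """Tiny stand-in for one stage of the cascade."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, out_ch, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

# Stage 1: label-free autofluorescence (1 channel) -> virtual H&E (RGB).
autofluor_to_he = ConvStainNet(in_ch=1, out_ch=3)
# Stage 2: H&E (RGB) -> virtual PAS (RGB), applied to the stage-1 output.
he_to_pas = ConvStainNet(in_ch=3, out_ch=3)

autofluorescence = torch.rand(1, 1, 256, 256)    # dummy input patch
virtual_he = autofluor_to_he(autofluorescence)   # intermediate H&E image
virtual_pas = he_to_pas(virtual_he)              # final cascaded PAS image
print(virtual_he.shape, virtual_pas.shape)
```

Chaining two translation networks like this lets the intermediate H&E image serve as a common bridge, which is the design choice the C-DNN abstract highlights; each stage would in practice be trained on its own registered image pairs.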