Segmentation with Residual Attention U-Net and an Edge-Enhancement
Approach Preserves Cell Shape Features
- URL: http://arxiv.org/abs/2001.05548v1
- Date: Wed, 15 Jan 2020 20:44:39 GMT
- Title: Segmentation with Residual Attention U-Net and an Edge-Enhancement
Approach Preserves Cell Shape Features
- Authors: Nanyan Zhu, Chen Liu, Zakary S. Singer, Tal Danino, Andrew F. Laine,
Jia Guo
- Abstract summary: We modified the U-Net architecture to segment cells in fluorescence widefield microscopy images and quantitatively evaluated its performance.
With a 97% sensitivity, 93% specificity, 91% Jaccard similarity, and 95% Dice coefficient, our proposed method, Residual Attention U-Net with edge-enhancement, surpassed the state-of-the-art U-Net in segmentation performance.
- Score: 12.676246022612533
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The ability to extrapolate gene expression dynamics in living single cells
requires robust cell segmentation, and one of the challenges is the amorphous
or irregularly shaped cell boundaries. To address this issue, we modified the
U-Net architecture to segment cells in fluorescence widefield microscopy images
and quantitatively evaluated its performance. We also proposed a novel loss
function approach that emphasizes the segmentation accuracy on cell boundaries
and encourages shape feature preservation. With a 97% sensitivity, 93%
specificity, 91% Jaccard similarity, and 95% Dice coefficient, our proposed
method, Residual Attention U-Net with edge-enhancement, surpassed the
state-of-the-art U-Net in segmentation performance as evaluated by the
traditional metrics. More remarkably, the same proposed candidate also
performed the best in terms of the preservation of valuable shape features,
namely area, eccentricity, major axis length, solidity and orientation. These
improvements in shape feature preservation can serve as useful assets for
downstream cell tracking and quantification of changes in cell statistics or
features over time.
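The abstract reports the edge-enhancement loss and the evaluation only at a high level, so the sketch below is purely illustrative: a boundary-weighted cross-entropy term, the Dice and Jaccard metrics quoted above, and the per-cell shape features (area, eccentricity, major axis length, solidity, orientation) measured with scikit-image. The weighting scheme and all function names are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy import ndimage
from skimage.measure import label, regionprops

def boundary_weight_map(mask, band=2, edge_weight=5.0):
    """Illustrative edge-enhancement weights: pixels in a band around each
    cell boundary receive a larger loss weight (assumed scheme)."""
    mask = mask.astype(bool)
    dilated = ndimage.binary_dilation(mask, iterations=band)
    eroded = ndimage.binary_erosion(mask, iterations=band)
    edge_band = dilated ^ eroded
    return np.where(edge_band, edge_weight, 1.0)

def edge_weighted_bce(pred_prob, target, weights, eps=1e-7):
    """Pixel-wise binary cross-entropy scaled by the boundary weight map."""
    pred_prob = np.clip(pred_prob, eps, 1.0 - eps)
    bce = -(target * np.log(pred_prob) + (1 - target) * np.log(1 - pred_prob))
    return float((weights * bce).mean())

def dice_and_jaccard(pred_mask, true_mask):
    """Dice coefficient and Jaccard similarity, as reported in the abstract."""
    pred_mask, true_mask = pred_mask.astype(bool), true_mask.astype(bool)
    inter = np.logical_and(pred_mask, true_mask).sum()
    dice = 2.0 * inter / (pred_mask.sum() + true_mask.sum())
    jaccard = inter / np.logical_or(pred_mask, true_mask).sum()
    return dice, jaccard

def shape_features(mask):
    """Per-cell shape features named in the abstract, via regionprops."""
    return [
        dict(area=r.area,
             eccentricity=r.eccentricity,
             major_axis_length=r.major_axis_length,
             solidity=r.solidity,
             orientation=r.orientation)
        for r in regionprops(label(mask))
    ]
```

Comparing these per-cell features between predicted and ground-truth masks is one straightforward way to quantify the shape-preservation claim.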
Related papers
- Cell as Point: One-Stage Framework for Efficient Cell Tracking [54.19259129722988]
This paper proposes the novel end-to-end CAP framework to achieve efficient and stable cell tracking in one stage.
CAP abandons detection or segmentation stages and simplifies the process by exploiting the correlation among the trajectories of cell points to track cells jointly.
CAP demonstrates strong cell tracking performance while also being 10 to 55 times more efficient than existing methods.
arXiv Detail & Related papers (2024-11-22T10:16:35Z) - LKCell: Efficient Cell Nuclei Instance Segmentation with Large Convolution Kernels [32.157968641130545]
We propose LKCell, a high-accuracy and efficient cell segmentation method.
Its core insight lies in unleashing the potential of large convolution kernels to achieve computationally efficient large receptive fields.
We analyze the redundancy of previous methods and design a new segmentation decoder based on large convolution kernels.
arXiv Detail & Related papers (2024-07-25T14:07:49Z) - Enhancing Cell Instance Segmentation in Scanning Electron Microscopy Images via a Deep Contour Closing Operator [0.04568852250743578]
This study presents an AI-driven approach for refining cell boundary delineation to improve instance-based cell segmentation in SEM images.
A CNN, COp-Net, is introduced to address gaps in cell contours, effectively filling in regions with deficient or absent information.
We showcase the efficacy of our approach in improving cell boundary precision using both private SEM images from PDX hepatoblastoma tissues and publicly accessible image datasets.
arXiv Detail & Related papers (2024-07-22T17:32:06Z) - Single-Cell Deep Clustering Method Assisted by Exogenous Gene
Information: A Novel Approach to Identifying Cell Types [50.55583697209676]
We develop an attention-enhanced graph autoencoder, which is designed to efficiently capture the topological features between cells.
During the clustering process, we integrate both sets of information and reconstruct the features of both cells and genes to generate a discriminative representation.
This research offers enhanced insights into the characteristics and distribution of cells, thereby laying the groundwork for early diagnosis and treatment of diseases.
arXiv Detail & Related papers (2023-11-28T09:14:55Z) - ARHNet: Adaptive Region Harmonization for Lesion-aware Augmentation to
Improve Segmentation Performance [61.04246102067351]
We propose a foreground harmonization framework (ARHNet) to tackle intensity disparities and make synthetic images look more realistic.
We demonstrate the efficacy of our method in improving the segmentation performance using real and synthetic images.
arXiv Detail & Related papers (2023-07-02T10:39:29Z) - TANGOS: Regularizing Tabular Neural Networks through Gradient
Orthogonalization and Specialization [69.80141512683254]
We introduce Tabular Neural Gradient Orthogonalization and Specialization (TANGOS).
TANGOS is a novel framework for regularization in the tabular setting built on latent unit attributions.
We demonstrate that our approach can lead to improved out-of-sample generalization performance, outperforming other popular regularization methods.
arXiv Detail & Related papers (2023-03-09T18:57:13Z) - Toward Accurate and Reliable Iris Segmentation Using Uncertainty
Learning [96.72850130126294]
We propose an Iris U-transformer (IrisUsformer) for accurate and reliable iris segmentation.
For better accuracy, we carefully design IrisUsformer by adopting position-sensitive operations and re-packaging the transformer block.
We show that IrisUsformer achieves better segmentation accuracy using 35% of the MACs of the SOTA IrisParseNet.
arXiv Detail & Related papers (2021-10-20T01:37:19Z) - IH-GAN: A Conditional Generative Model for Implicit Surface-Based
Inverse Design of Cellular Structures [15.540823405781337]
We propose a deep generative model that generates diverse cellular unit cells conditioned on desired material properties.
Results show that our method can 1) generate various unit cells that satisfy given material properties with high accuracy (relative error 5%), 2) create functionally graded cellular structures with high-quality interface connectivity (98.7% average overlap area at interfaces), and 3) improve the structural performance over the conventional topology-optimized variable-density structure.
arXiv Detail & Related papers (2021-03-03T18:39:25Z) - Accurate Cell Segmentation in Digital Pathology Images via Attention
Enforced Networks [0.0]
We propose an Attention Enforced Network (AENet) to integrate local features with global dependencies and weight effective channels adaptively.
In the test stage, we present an individual color normalization method to deal with the stain variation problem.
arXiv Detail & Related papers (2020-12-14T03:39:33Z) - An Uncertainty-Driven GCN Refinement Strategy for Organ Segmentation [53.425900196763756]
We propose a segmentation refinement method based on uncertainty analysis and graph convolutional networks.
We employ the uncertainty levels of the convolutional network in a particular input volume to formulate a semi-supervised graph learning problem.
We show that our method outperforms the state-of-the-art CRF refinement method by improving the Dice score by 1% for the pancreas and 2% for the spleen.
arXiv Detail & Related papers (2020-12-06T18:55:07Z) - Learning to segment clustered amoeboid cells from brightfield microscopy
via multi-task learning with adaptive weight selection [6.836162272841265]
We introduce a novel supervised technique for cell segmentation in a multi-task learning paradigm.
A combination of a multi-task loss, based on the region and cell boundary detection, is employed for an improved prediction efficiency of the network.
We observe an overall Dice score of 0.93 on the validation set, which is an improvement of over 15.9% on a recent unsupervised method, and outperforms the popular supervised U-Net algorithm by at least 5.8% on average (a rough sketch of such a combined region-and-boundary loss appears after this list).
arXiv Detail & Related papers (2020-05-19T11:31:53Z)
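The multi-task entry above (clustered amoeboid cells) combines a region-segmentation loss with a cell-boundary loss under adaptively selected weights. The summary does not give the exact scheme, so the sketch below uses a common learned-uncertainty weighting as a stand-in; the class name and the weighting recipe are assumptions, not that paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RegionBoundaryLoss(nn.Module):
    """Assumed two-task loss: region mask + boundary map, with learnable
    per-task weights (homoscedastic-uncertainty style)."""
    def __init__(self):
        super().__init__()
        self.log_var_region = nn.Parameter(torch.zeros(()))
        self.log_var_boundary = nn.Parameter(torch.zeros(()))

    def forward(self, region_logits, boundary_logits, region_gt, boundary_gt):
        region_loss = F.binary_cross_entropy_with_logits(region_logits, region_gt)
        boundary_loss = F.binary_cross_entropy_with_logits(boundary_logits, boundary_gt)
        # Each task is scaled by a learned precision; the additive log-variance
        # terms keep the weights from collapsing to zero.
        return (torch.exp(-self.log_var_region) * region_loss + self.log_var_region
                + torch.exp(-self.log_var_boundary) * boundary_loss + self.log_var_boundary)
```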
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences arising from its use.