UCC: Uncertainty guided Cross-head Co-training for Semi-Supervised
Semantic Segmentation
- URL: http://arxiv.org/abs/2205.10334v1
- Date: Fri, 20 May 2022 17:43:47 GMT
- Title: UCC: Uncertainty guided Cross-head Co-training for Semi-Supervised
Semantic Segmentation
- Authors: Jiashuo Fan, Bin Gao, Huan Jin, Lihui Jiang
- Abstract summary: We present a novel learning framework called Uncertainty guided Cross-head Co-training (UCC) for semi-supervised semantic segmentation.
Our framework introduces weak and strong augmentations within a shared encoder to achieve co-training, which naturally combines the benefits of consistency and self-training.
Our approach significantly outperforms other state-of-the-art semi-supervised semantic segmentation methods.
- Score: 2.6324267940354655
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep neural networks (DNNs) have achieved great success in semantic
segmentation, which requires a large amount of labeled data for training. We
present a novel learning framework called Uncertainty guided Cross-head
Co-training (UCC) for semi-supervised semantic segmentation. Our framework
introduces weak and strong augmentations within a shared encoder to achieve
co-training, which naturally combines the benefits of consistency and
self-training. Every segmentation head interacts with its peers, and the
prediction on the weak augmentation is used to supervise the strong one. The
diversity of the consistency-training samples can be boosted by Dynamic
Cross-Set Copy-Paste (DCSCP), which also alleviates the distribution-mismatch
and class-imbalance problems. Moreover, our proposed Uncertainty Guided
Re-weight Module (UGRM) improves self-training by modeling uncertainty to
suppress the effect of low-quality pseudo labels from a head's peers. Extensive
experiments on Cityscapes and PASCAL VOC 2012 demonstrate the effectiveness of
our UCC. Our approach significantly outperforms other state-of-the-art
semi-supervised semantic segmentation methods. It achieves 77.17$\%$ and
76.49$\%$ mIoU on the Cityscapes and PASCAL VOC 2012 datasets respectively
under the 1/16 protocol, which is +10.1$\%$ and +7.91$\%$ over the supervised
baseline.
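The weak-supervises-strong scheme with uncertainty-based re-weighting described in the abstract can be illustrated with a minimal sketch. This is a pure-Python simplification under stated assumptions: pixels are represented as per-pixel logit lists, and the `exp(-entropy)` weighting and all function names are illustrative, not taken from the paper's implementation.

```python
import math

def softmax(logits):
    """Numerically stable softmax over one pixel's class logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def entropy(probs):
    """Shannon entropy of one pixel's class distribution (nats)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def uncertainty_weighted_loss(weak_logits, strong_logits):
    """For each pixel, take the weak-view prediction as the pseudo label
    and weight its cross-entropy on the strong view by exp(-entropy),
    so uncertain (high-entropy) pseudo labels contribute less."""
    total, weight_sum = 0.0, 0.0
    for wl, sl in zip(weak_logits, strong_logits):
        p_weak = softmax(wl)
        pseudo = p_weak.index(max(p_weak))      # hard pseudo label
        w = math.exp(-entropy(p_weak))          # confident pixel -> weight near 1
        p_strong = softmax(sl)
        total += w * (-math.log(p_strong[pseudo]))
        weight_sum += w
    return total / weight_sum
```

A confident weak prediction (low entropy) thus dominates the average, while a near-uniform one is down-weighted rather than discarded outright.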
Related papers
- CRMSP: A Semi-supervised Approach for Key Information Extraction with Class-Rebalancing and Merged Semantic Pseudo-Labeling [10.886757419138343]
We propose CRMSP, a novel semi-supervised approach for key information extraction (KIE) with Class-Rebalancing and Merged Semantic Pseudo-Labeling.
The CRP module introduces a reweighting factor to rebalance pseudo-labels, increasing attention to tail classes.
The MSP module clusters tail features of unlabeled data by assigning samples to Merged Prototypes (MP).
arXiv Detail & Related papers (2024-07-19T07:41:26Z)
- ECAP: Extensive Cut-and-Paste Augmentation for Unsupervised Domain Adaptive Semantic Segmentation [4.082799056366928]
We propose an extensive cut-and-paste strategy (ECAP) to leverage reliable pseudo-labels through data augmentation.
ECAP maintains a memory bank of pseudo-labeled target samples throughout training and cut-and-pastes the most confident ones onto the current training batch.
We implement ECAP on top of the recent method MIC and boost its performance on two synthetic-to-real domain adaptation benchmarks.
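The confidence-guided cut-and-paste idea behind ECAP (and, similarly, the copy-paste in DCSCP above) can be sketched as follows. This is a hypothetical simplification on tiny 2-D grids; the function name, arguments, and grid representation are assumptions for illustration, not ECAP's actual code, which operates on image tensors with a memory bank.

```python
def cut_and_paste(target_img, target_lbl, source_img, source_lbl, paste_class):
    """Paste all pixels of `paste_class` from a confidently pseudo-labeled
    source sample onto the target image, updating the target label map."""
    out_img = [row[:] for row in target_img]    # copy, don't mutate inputs
    out_lbl = [row[:] for row in target_lbl]
    for i, row in enumerate(source_lbl):
        for j, cls in enumerate(row):
            if cls == paste_class:
                out_img[i][j] = source_img[i][j]
                out_lbl[i][j] = cls
    return out_img, out_lbl
```

Pasting only the most confident pseudo-labeled regions keeps the synthesized training pairs consistent: the new label map is correct by construction wherever pixels were copied.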
arXiv Detail & Related papers (2024-03-06T17:06:07Z)
- Adversarial Dual-Student with Differentiable Spatial Warping for Semi-Supervised Semantic Segmentation [70.2166826794421]
We propose a differentiable geometric warping to conduct unsupervised data augmentation.
We also propose a novel adversarial dual-student framework to improve the Mean-Teacher.
Our solution significantly improves performance, and state-of-the-art results are achieved on both datasets.
arXiv Detail & Related papers (2022-03-05T17:36:17Z)
- Guided Point Contrastive Learning for Semi-supervised Point Cloud Semantic Segmentation [90.2445084743881]
We present a method for semi-supervised point cloud semantic segmentation that uses unlabeled point clouds during training to boost model performance.
Inspired by the recent contrastive loss in self-supervised tasks, we propose the guided point contrastive loss to enhance the feature representation and model generalization ability.
arXiv Detail & Related papers (2021-10-15T16:38:54Z)
- Adaptive Affinity Loss and Erroneous Pseudo-Label Refinement for Weakly Supervised Semantic Segmentation [48.294903659573585]
In this paper, we propose to embed affinity learning of multi-stage approaches in a single-stage model.
A deep neural network is used to deliver comprehensive semantic information in the training phase.
Experiments are conducted on the PASCAL VOC 2012 dataset to evaluate the effectiveness of our proposed approach.
arXiv Detail & Related papers (2021-08-03T07:48:33Z)
- Semi-supervised Contrastive Learning with Similarity Co-calibration [72.38187308270135]
We propose a novel training strategy termed Semi-supervised Contrastive Learning (SsCL).
SsCL combines the well-known contrastive loss in self-supervised learning with the cross entropy loss in semi-supervised learning.
We show that SsCL produces more discriminative representations and is beneficial to few-shot learning.
arXiv Detail & Related papers (2021-05-16T09:13:56Z)
- A Simple Baseline for Semi-supervised Semantic Segmentation with Strong Data Augmentation [74.8791451327354]
We propose a simple yet effective semi-supervised learning framework for semantic segmentation.
A set of simple design and training techniques can collectively improve the performance of semi-supervised semantic segmentation significantly.
Our method achieves state-of-the-art results in the semi-supervised settings on the Cityscapes and Pascal VOC datasets.
arXiv Detail & Related papers (2021-04-15T06:01:39Z)
- Dynamic Divide-and-Conquer Adversarial Training for Robust Semantic Segmentation [79.42338812621874]
Adversarial training is a promising way to improve the robustness of deep neural networks against adversarial perturbations.
We formulate a general adversarial training procedure that can perform decently on both adversarial and clean samples.
We propose a dynamic divide-and-conquer adversarial training (DDC-AT) strategy to enhance the defense effect.
arXiv Detail & Related papers (2020-03-14T05:06:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.