Semi-supervised Semantic Segmentation via Boosting Uncertainty on
Unlabeled Data
- URL: http://arxiv.org/abs/2311.18758v1
- Date: Thu, 30 Nov 2023 18:01:03 GMT
- Title: Semi-supervised Semantic Segmentation via Boosting Uncertainty on
Unlabeled Data
- Authors: Daoan Zhang, Yunhao Luo, Jianguo Zhang
- Abstract summary: We provide an analysis on the labeled and unlabeled distributions in training datasets.
We propose two strategies and design an uncertainty booster algorithm, specifically for semi-supervised semantic segmentation.
Our approach achieves state-of-the-art performance in our experiments compared to the current semi-supervised semantic segmentation methods.
- Score: 6.318105712690353
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We bring a new perspective to semi-supervised semantic segmentation by
providing an analysis on the labeled and unlabeled distributions in training
datasets. We first show that the distribution gap between labeled and
unlabeled datasets cannot be ignored, even though the two datasets are sampled
from the same distribution. To address this issue, we theoretically analyze and
experimentally prove that appropriately boosting uncertainty on unlabeled data
can help minimize the distribution gap, which benefits the generalization of
the model. We propose two strategies and design an uncertainty booster
algorithm, specifically for semi-supervised semantic segmentation. Extensive
experiments are carried out based on these theories, and the results confirm
the efficacy of the algorithm and strategies. Our plug-and-play uncertainty
booster is tiny, efficient, and robust to hyperparameters, yet it significantly
improves performance. Our approach achieves state-of-the-art performance in our
experiments compared to the current semi-supervised semantic segmentation
methods on the popular benchmarks: Cityscapes and PASCAL VOC 2012 with
different train settings.
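The core idea above, rewarding higher predictive uncertainty on unlabeled data, can be illustrated with a minimal numpy sketch. This is a hypothetical encoding of "boosting uncertainty" as a mean-entropy reward; the function names, the `weight` parameter, and the exact form are assumptions for illustration, not the paper's actual booster algorithm.

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def uncertainty_boost_term(logits, weight=0.1):
    # Mean per-pixel predictive entropy; adding this (weighted) as a reward
    # to the training objective encourages higher uncertainty on unlabeled
    # pixels, which the paper argues narrows the labeled/unlabeled gap.
    p = softmax(logits)
    entropy = -(p * np.log(p + 1e-8)).sum(axis=-1)
    return weight * entropy.mean()

# Four "pixels" with 3 classes: confident vs. near-uniform predictions.
confident = np.array([[9.0, 0.0, 0.0]] * 4)
uniform = np.zeros((4, 3))
```

Uniform predictions yield the maximum reward (`weight * log(C)` for `C` classes), while confident predictions yield a reward near zero, so the term only pushes back against overconfidence.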
Related papers
- Dual-Decoupling Learning and Metric-Adaptive Thresholding for Semi-Supervised Multi-Label Learning [81.83013974171364]
Semi-supervised multi-label learning (SSMLL) is a powerful framework for leveraging unlabeled data to reduce the expensive cost of collecting precise multi-label annotations.
Unlike semi-supervised learning, one cannot select the most probable label as the pseudo-label in SSMLL due to multiple semantics contained in an instance.
We propose a dual-perspective method to generate high-quality pseudo-labels.
arXiv Detail & Related papers (2024-07-26T09:33:53Z)
- Self Adaptive Threshold Pseudo-labeling and Unreliable Sample Contrastive Loss for Semi-supervised Image Classification [6.920336485308536]
Pseudo-labeling-based semi-supervised approaches suffer from two problems in image classification.
We develop a self-adaptive threshold pseudo-labeling strategy, in which the threshold for each class can be dynamically adjusted to increase the number of reliable samples.
In order to effectively utilise unlabeled data with confidence below the thresholds, we propose an unreliable sample contrastive loss.
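Per-class adaptive thresholding of the kind this summary describes can be sketched as follows. This is a hypothetical illustration: the EMA update rule and the `momentum` and `base` parameters are assumptions, not the paper's actual formula.

```python
import numpy as np

def update_class_thresholds(probs, thresholds, momentum=0.9, base=0.95):
    # probs: (N, C) softmax outputs on unlabeled samples.
    # Each class's threshold drifts toward a value scaled by that class's
    # mean confidence, so hard classes get lower bars and admit more samples.
    preds = probs.argmax(axis=1)
    conf = probs.max(axis=1)
    new_t = thresholds.copy()
    for c in range(probs.shape[1]):
        mask = preds == c
        if mask.any():
            estimate = base * conf[mask].mean()
            new_t[c] = momentum * thresholds[c] + (1 - momentum) * estimate
    return new_t

def select_pseudo_labels(probs, thresholds):
    # Keep only predictions whose confidence clears their class's threshold.
    preds = probs.argmax(axis=1)
    conf = probs.max(axis=1)
    reliable = conf >= thresholds[preds]
    return preds[reliable], reliable

# Class 0 is mostly confident, class 1 is not.
probs = np.array([[0.97, 0.03], [0.96, 0.04], [0.55, 0.45], [0.40, 0.60]])
thresholds = np.full(2, 0.9)
```

After one update, the harder class's threshold drops below the easier class's, which is the intended adaptive behavior; samples below the thresholds would then feed the contrastive loss rather than pseudo-labeling.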
arXiv Detail & Related papers (2024-07-04T03:04:56Z)
- EPL: Evidential Prototype Learning for Semi-supervised Medical Image Segmentation [0.0]
We propose Evidential Prototype Learning (EPL) to fuse voxel probability predictions from different sources and to make use of prototype fusion across labeled and unlabeled data.
The uncertainty not only enables the model to self-correct predictions but also improves the guided learning process with pseudo-labels and is able to feed back into the construction of hidden features.
arXiv Detail & Related papers (2024-04-09T10:04:06Z)
- All Points Matter: Entropy-Regularized Distribution Alignment for Weakly-supervised 3D Segmentation [67.30502812804271]
Pseudo-labels are widely employed in weakly supervised 3D segmentation tasks where only sparse ground-truth labels are available for learning.
We propose a novel learning strategy to regularize the generated pseudo-labels and effectively narrow the gaps between pseudo-labels and model predictions.
arXiv Detail & Related papers (2023-05-25T08:19:31Z)
- Adaptive Negative Evidential Deep Learning for Open-set Semi-supervised Learning [69.81438976273866]
Open-set semi-supervised learning (Open-set SSL) considers a more practical scenario, where unlabeled data and test data contain new categories (outliers) not observed in labeled data (inliers).
We introduce evidential deep learning (EDL) as an outlier detector to quantify different types of uncertainty, and design different uncertainty metrics for self-training and inference.
We propose a novel adaptive negative optimization strategy, making EDL more tailored to the unlabeled dataset containing both inliers and outliers.
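Quantifying uncertainty with evidential deep learning, as this summary describes, typically follows the standard subjective-logic formulation: class evidence defines a Dirichlet distribution whose total strength determines a vacuity (uncertainty) mass. The sketch below shows that standard computation, not this paper's specific metrics or its adaptive negative optimization.

```python
import numpy as np

def edl_uncertainty(evidence):
    # evidence: (N, C) non-negative per-class evidence
    # (e.g. a softplus of the network logits).
    alpha = evidence + 1.0            # Dirichlet concentration parameters
    strength = alpha.sum(axis=1)      # total Dirichlet strength S
    belief = evidence / strength[:, None]   # per-class belief masses
    vacuity = evidence.shape[1] / strength  # uncertainty u = K / S, in (0, 1]
    return belief, vacuity

# Strong evidence for one class vs. no evidence at all (3 classes).
evidence = np.array([[10.0, 0.0, 0.0],
                     [0.0, 0.0, 0.0]])
```

A sample with no evidence gets maximal uncertainty (u = 1), which is what makes vacuity usable as an outlier score for self-training and inference; beliefs and vacuity always sum to one.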
arXiv Detail & Related papers (2023-03-21T09:07:15Z)
- Rethinking Clustering-Based Pseudo-Labeling for Unsupervised Meta-Learning [146.11600461034746]
CACTUs, a method for unsupervised meta-learning, is a clustering-based approach with pseudo-labeling.
This approach is model-agnostic and can be combined with supervised algorithms to learn from unlabeled data.
We prove that the core reason for this is the lack of a clustering-friendly property in the embedding space.
arXiv Detail & Related papers (2022-09-27T19:04:36Z)
- Uncertainty-Guided Mutual Consistency Learning for Semi-Supervised Medical Image Segmentation [9.745971699005857]
We propose a novel uncertainty-guided mutual consistency learning framework for medical image segmentation.
It integrates intra-task consistency learning from up-to-date predictions for self-ensembling and cross-task consistency learning from task-level regularization to exploit geometric shape information.
Our method achieves performance gains by leveraging unlabeled data and outperforms existing semi-supervised segmentation methods.
arXiv Detail & Related papers (2021-12-05T08:19:41Z)
- Adaptive Affinity Loss and Erroneous Pseudo-Label Refinement for Weakly Supervised Semantic Segmentation [48.294903659573585]
In this paper, we propose to embed affinity learning of multi-stage approaches in a single-stage model.
A deep neural network is used to deliver comprehensive semantic information in the training phase.
Experiments are conducted on the PASCAL VOC 2012 dataset to evaluate the effectiveness of our proposed approach.
arXiv Detail & Related papers (2021-08-03T07:48:33Z)
- Re-distributing Biased Pseudo Labels for Semi-supervised Semantic Segmentation: A Baseline Investigation [30.688753736660725]
We present a simple and yet effective Distribution Alignment and Random Sampling (DARS) method to produce unbiased pseudo labels.
Our method performs favorably in comparison with state-of-the-art approaches.
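Distribution alignment with sampling, as the DARS summary describes, can be sketched as keeping per-class quotas of confident pseudo-labels so that the retained labels match the labeled set's class distribution. This is an illustrative approximation under assumed inputs (`labeled_freq`, `budget`), not the paper's exact procedure.

```python
import numpy as np

def distribution_aligned_sampling(probs, labeled_freq, budget):
    # probs: (N, C) predictions on unlabeled data.
    # labeled_freq: (C,) class frequencies observed in the labeled set.
    # budget: total number of pseudo-labels to retain.
    preds = probs.argmax(axis=1)
    conf = probs.max(axis=1)
    keep = np.zeros(len(preds), dtype=bool)
    for c, freq in enumerate(labeled_freq):
        idx = np.where(preds == c)[0]
        quota = min(len(idx), int(round(budget * freq)))
        # Keep the most confident predictions up to the class quota, so the
        # retained pseudo-labels follow the labeled class distribution
        # instead of the model's (possibly biased) prediction distribution.
        keep[idx[np.argsort(conf[idx])[::-1][:quota]]] = True
    return keep

# Model over-predicts class 0 (4 of 6 samples) despite balanced labeled data.
probs = np.array([[0.9, 0.1], [0.8, 0.2], [0.7, 0.3],
                  [0.6, 0.4], [0.2, 0.8], [0.3, 0.7]])
```

With a balanced labeled frequency and a budget of four, the sampler keeps two pseudo-labels per class, trimming the over-predicted class back to the labeled distribution.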
arXiv Detail & Related papers (2021-07-23T14:45:14Z)
- WSSOD: A New Pipeline for Weakly- and Semi-Supervised Object Detection [75.80075054706079]
We propose a weakly- and semi-supervised object detection framework (WSSOD).
An agent detector is first trained on a joint dataset and then used to predict pseudo bounding boxes on weakly-annotated images.
The proposed framework demonstrates remarkable performance on the PASCAL-VOC and MSCOCO benchmarks, achieving performance comparable to that obtained in fully-supervised settings.
arXiv Detail & Related papers (2021-05-21T11:58:50Z)
- Bayesian Semi-supervised Crowdsourcing [71.20185379303479]
Crowdsourcing has emerged as a powerful paradigm for efficiently labeling large datasets and performing various learning tasks.
This work deals with semi-supervised crowdsourced classification, under two regimes of semi-supervision.
arXiv Detail & Related papers (2020-12-20T23:18:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.