SimT: Handling Open-set Noise for Domain Adaptive Semantic Segmentation
- URL: http://arxiv.org/abs/2203.15202v1
- Date: Tue, 29 Mar 2022 02:48:08 GMT
- Title: SimT: Handling Open-set Noise for Domain Adaptive Semantic Segmentation
- Authors: Xiaoqing Guo, Jie Liu, Tongliang Liu and Yixuan Yuan
- Abstract summary: This paper studies a practical domain adaptive (DA) semantic segmentation problem where only pseudo-labeled target data is accessible through a black-box model.
Due to the domain gap and label shift between two domains, pseudo-labeled target data contains mixed closed-set and open-set label noises.
We propose a simplex noise transition matrix (SimT) to model the mixed noise distributions in DA semantic segmentation and formulate the problem as estimation of SimT.
- Score: 58.61946589036262
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper studies a practical domain adaptive (DA) semantic segmentation
problem where only pseudo-labeled target data is accessible through a black-box
model. Due to the domain gap and label shift between two domains,
pseudo-labeled target data contains mixed closed-set and open-set label noises.
In this paper, we propose a simplex noise transition matrix (SimT) to model the
mixed noise distributions in DA semantic segmentation and formulate the problem
as estimation of SimT. By exploiting computational geometry analysis and
properties of segmentation, we design three complementary regularizers, i.e.,
volume regularization, anchor guidance, and convex guarantee, to approximate
the true SimT. Specifically, volume regularization minimizes the volume of the
simplex formed by the rows of the non-square SimT, which ensures that the
outputs of the segmentation model fit the ground-truth label distribution. To
compensate for the lack of open-set knowledge, anchor guidance and convex
guarantee are devised to facilitate the modeling of the open-set noise
distribution and to enhance discriminative feature learning between closed-set
and open-set classes. The estimated SimT is further utilized to correct noise
in pseudo labels and to promote the generalization ability of the segmentation
model on target-domain data.
Extensive experimental results demonstrate that the proposed SimT can be
flexibly plugged into existing DA methods to boost performance. The source
code is available at https://github.com/CityU-AIM-Group/SimT.
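The core mechanics described in the abstract can be sketched numerically. The snippet below is a minimal illustration only: the class counts, the Dirichlet-sampled matrix, and the log-det volume surrogate are assumptions made for this sketch, not the paper's exact formulation.

```python
import numpy as np

# Hypothetical sizes: C closed-set classes observable in pseudo labels,
# O extra open-set classes the black-box source model cannot emit.
C, O = 3, 2
rng = np.random.default_rng(0)

# A non-square transition matrix T of shape (C + O, C): row i gives the
# probability that a pixel whose true class is i receives noisy label j.
# Each row lies on the probability simplex (non-negative, sums to 1).
T = rng.dirichlet(alpha=np.ones(C), size=C + O)
assert np.allclose(T.sum(axis=1), 1.0)

# Given a posterior over the C + O true classes, the induced distribution
# over the observed noisy labels is p_noisy = T^T p_true.
p_true = rng.dirichlet(alpha=np.ones(C + O))
p_noisy = T.T @ p_true
assert np.allclose(p_noisy.sum(), 1.0)

# One common volume surrogate for the simplex spanned by the rows of a
# non-square matrix is 0.5 * logdet(T^T T); driving it down shrinks the
# simplex, pulling model outputs toward the true label distribution.
sign, logdet = np.linalg.slogdet(T.T @ T)
vol_proxy = 0.5 * logdet
print(vol_proxy)
```

In this reading, estimating SimT amounts to optimizing T under such a volume penalty while keeping every row on the simplex; the anchor-guidance and convex-guarantee terms would add further constraints that this sketch omits.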
Related papers
- Inaccurate Label Distribution Learning with Dependency Noise [52.08553913094809]
We introduce the Dependent Noise-based Inaccurate Label Distribution Learning (DN-ILDL) framework to tackle the challenges posed by noise in label distribution learning.
We show that DN-ILDL effectively addresses the ILDL problem and outperforms existing LDL methods.
arXiv Detail & Related papers (2024-05-26T07:58:07Z)
- Continual-MAE: Adaptive Distribution Masked Autoencoders for Continual Test-Time Adaptation [49.827306773992376]
Continual Test-Time Adaptation (CTTA) is proposed to migrate a source pre-trained model to continually changing target distributions.
Our proposed method attains state-of-the-art performance in both classification and segmentation CTTA tasks.
arXiv Detail & Related papers (2023-12-19T15:34:52Z)
- Semantic Connectivity-Driven Pseudo-labeling for Cross-domain Segmentation [89.41179071022121]
Self-training is a prevailing approach in cross-domain semantic segmentation.
We propose a novel approach called Semantic Connectivity-driven pseudo-labeling.
This approach formulates pseudo-labels at the connectivity level and thus can facilitate learning structured and low-noise semantics.
arXiv Detail & Related papers (2023-12-11T12:29:51Z)
- Stochastic Segmentation with Conditional Categorical Diffusion Models [3.8168879948759953]
We propose a conditional categorical diffusion model (CCDM) for semantic segmentation based on Denoising Diffusion Probabilistic Models.
Our results show that CCDM achieves state-of-the-art performance on LIDC, and outperforms established baselines on the classical segmentation dataset Cityscapes.
arXiv Detail & Related papers (2023-03-15T19:16:47Z)
- MAPS: A Noise-Robust Progressive Learning Approach for Source-Free Domain Adaptive Keypoint Detection [76.97324120775475]
Cross-domain keypoint detection methods always require accessing the source data during adaptation.
This paper considers source-free domain adaptive keypoint detection, where only the well-trained source model is provided to the target domain.
arXiv Detail & Related papers (2023-02-09T12:06:08Z)
- Learning Confident Classifiers in the Presence of Label Noise [5.829762367794509]
This paper proposes a probabilistic model for noisy observations that allows us to build confident classification and segmentation models.
Our experiments show that our algorithm outperforms state-of-the-art solutions for the considered classification and segmentation problems.
arXiv Detail & Related papers (2023-01-02T04:27:25Z)
- Towards Robust Adaptive Object Detection under Noisy Annotations [40.25050610617893]
Existing methods assume that the source domain labels are completely clean, yet large-scale datasets often contain error-prone annotations due to instance ambiguity.
We propose a Noise Latent Transferability Exploration (NLTE) framework to address this issue.
NLTE improves the mAP by 8.4% under 60% corrupted annotations and even approaches the ideal upper bound of training on a clean source dataset.
arXiv Detail & Related papers (2022-04-06T07:02:37Z)
- ANL: Anti-Noise Learning for Cross-Domain Person Re-Identification [25.035093667770052]
We propose an Anti-Noise Learning (ANL) approach, which contains two modules.
The FDA module is designed to gather id-related samples and disperse id-unrelated samples through camera-wise contrastive learning and adversarial adaptation.
The Reliable Sample Selection (RSS) module utilizes an auxiliary model to correct noisy labels and select reliable samples for the main model.
arXiv Detail & Related papers (2020-12-27T02:38:45Z)
- Extended T: Learning with Mixed Closed-set and Open-set Noisy Labels [86.5943044285146]
The label noise transition matrix $T$ reflects the probabilities that true labels flip into noisy ones.
In this paper, we focus on learning under the mixed closed-set and open-set label noise.
Our method better models the mixed label noise, as evidenced by its more robust performance compared with prior state-of-the-art label-noise learning methods.
arXiv Detail & Related papers (2020-12-02T02:42:45Z)
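The transition-matrix idea that both the Extended T entry and SimT build on can be made concrete with a toy example. The matrix values below are hand-picked for illustration, and the correction shown is the standard forward-correction recipe (map the clean posterior through T before scoring), not the specific method of either paper.

```python
import numpy as np

# Hand-picked 3x3 closed-set transition matrix: T[i, j] is the probability
# that a sample whose true label is i is observed with noisy label j.
T = np.array([
    [0.8, 0.1, 0.1],
    [0.2, 0.7, 0.1],
    [0.1, 0.2, 0.7],
])
assert np.allclose(T.sum(axis=1), 1.0)  # each row is a distribution

# Forward correction: map the model's clean-class posterior through T
# before scoring it against the observed (noisy) label.
p_clean = np.array([0.7, 0.2, 0.1])
p_noisy = T.T @ p_clean                  # induced noisy-label distribution
observed_label = 0
loss = -np.log(p_noisy[observed_label])  # corrected cross-entropy term
print(p_noisy[observed_label])           # 0.8*0.7 + 0.2*0.2 + 0.1*0.1 = 0.61
```

Handling mixed closed-set and open-set noise, as in Extended T or SimT, amounts to augmenting such a matrix with rows for true classes that lie outside the observed label set, which is what makes the matrix non-square.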
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the accuracy of this information and is not responsible for any consequences of its use.