Source-Free Progressive Graph Learning for Open-Set Domain Adaptation
- URL: http://arxiv.org/abs/2202.06174v1
- Date: Sun, 13 Feb 2022 01:19:41 GMT
- Title: Source-Free Progressive Graph Learning for Open-Set Domain Adaptation
- Authors: Yadan Luo, Zijian Wang, Zhuoxiao Chen, Zi Huang and Mahsa
Baktashmotlagh
- Abstract summary: Open-set domain adaptation (OSDA) has gained considerable attention in many visual recognition tasks.
We propose a Progressive Graph Learning (PGL) framework that decomposes the target hypothesis space into the shared and unknown subspaces.
We also tackle a more realistic source-free open-set domain adaptation (SF-OSDA) setting that makes no assumption about the coexistence of source and target domains.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Open-set domain adaptation (OSDA) has gained considerable attention in many
visual recognition tasks. However, most existing OSDA approaches are limited
due to three main reasons, including: (1) the lack of essential theoretical
analysis of generalization bound, (2) the reliance on the coexistence of source
and target data during adaptation, and (3) failing to accurately estimate the
uncertainty of model predictions. We propose a Progressive Graph Learning (PGL)
framework that decomposes the target hypothesis space into the shared and
unknown subspaces, and then progressively pseudo-labels the most confident
known samples from the target domain for hypothesis adaptation. Moreover, we
tackle a more realistic source-free open-set domain adaptation (SF-OSDA)
setting that makes no assumption about the coexistence of source and target
domains, and introduce a balanced pseudo-labeling (BP-L) strategy in a
two-stage framework, namely SF-PGL. Different from PGL that applies a
class-agnostic constant threshold for all target samples for pseudo-labeling,
the SF-PGL model uniformly selects the most confident target instances from
each category at a fixed ratio. The confidence thresholds in each class are
regarded as the 'uncertainty' of learning the semantic information, which are
then used to weigh the classification loss in the adaptation step. We conducted
unsupervised and semi-supervised OSDA and SF-OSDA experiments on the benchmark
image classification and action recognition datasets. Additionally, we find
that balanced pseudo-labeling plays a significant role in improving
calibration, which makes the trained model less prone to over-confident or
under-confident predictions on the target data. Source code is available at
https://github.com/Luoyadan/SF-PGL.
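The balanced pseudo-labeling (BP-L) step described in the abstract — selecting the most confident target instances per predicted class at a fixed ratio, and treating each class's resulting confidence threshold as its 'uncertainty' for weighting the classification loss — can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation; the function name, the `ratio` value, and the direct use of the per-class threshold as a loss weight are hypothetical simplifications.

```python
import numpy as np

def balanced_pseudo_label(probs, ratio=0.2):
    """Per-class confident-sample selection at a fixed ratio.

    probs: (N, C) array of softmax outputs of the source model on target data.
    ratio: fraction of each class's predicted samples to pseudo-label.
    Returns (indices, labels, class_weights), where class_weights maps each
    class to its confidence threshold (the minimum confidence among selected
    samples), which the abstract describes as the class 'uncertainty' used to
    weigh the classification loss during adaptation.
    """
    preds = probs.argmax(axis=1)   # predicted class per target sample
    conf = probs.max(axis=1)       # prediction confidence per sample
    sel_idx, sel_lab, class_weights = [], [], {}
    for c in np.unique(preds):
        idx = np.where(preds == c)[0]
        k = max(1, int(len(idx) * ratio))            # fixed ratio per class
        top = idx[np.argsort(conf[idx])[::-1][:k]]   # most confident first
        sel_idx.extend(top.tolist())
        sel_lab.extend([int(c)] * k)
        # The smallest selected confidence acts as this class's threshold.
        class_weights[int(c)] = float(conf[top].min())
    return np.array(sel_idx), np.array(sel_lab), class_weights
```

Unlike a class-agnostic constant threshold (as in PGL), this selection adapts to each class, so easy and hard classes contribute pseudo-labels at the same rate — which the abstract credits with improving calibration on the target data.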
Related papers
- Uncertainty-guided Open-Set Source-Free Unsupervised Domain Adaptation with Target-private Class Segregation [22.474866164542302]
UDA approaches commonly assume that source and target domains share the same label space.
This paper considers the more challenging Source-Free Open-set Domain Adaptation (SF-OSDA) setting.
We propose a novel approach for SF-OSDA that exploits the granularity of target-private categories by segregating their samples into multiple unknown classes.
arXiv Detail & Related papers (2024-04-16T13:52:00Z) - Consistency Regularization for Generalizable Source-free Domain
Adaptation [62.654883736925456]
Source-free domain adaptation (SFDA) aims to adapt a well-trained source model to an unlabelled target domain without accessing the source dataset.
Existing SFDA methods only assess their adapted models on the target training set, neglecting data from unseen but identically distributed testing sets.
We propose a consistency regularization framework to develop a more generalizable SFDA method.
arXiv Detail & Related papers (2023-08-03T07:45:53Z) - Chaos to Order: A Label Propagation Perspective on Source-Free Domain
Adaptation [8.27771856472078]
We present Chaos to Order (CtO), a novel approach for source-free domain adaptation (SFDA).
CtO strives to constrain semantic credibility and propagate label information among target subpopulations.
Empirical evidence demonstrates that CtO outperforms the state of the art on three public benchmarks.
arXiv Detail & Related papers (2023-01-20T03:39:35Z) - Divide and Contrast: Source-free Domain Adaptation via Adaptive
Contrastive Learning [122.62311703151215]
Divide and Contrast (DaC) aims to connect the good ends of both worlds while bypassing their limitations.
DaC divides the target data into source-like and target-specific samples, where either group of samples is treated with tailored goals.
We further align the source-like domain with the target-specific samples using a memory bank-based Maximum Mean Discrepancy (MMD) loss to reduce the distribution mismatch.
arXiv Detail & Related papers (2022-11-12T09:21:49Z) - Uncertainty-guided Source-free Domain Adaptation [77.3844160723014]
Source-free domain adaptation (SFDA) aims to adapt a classifier to an unlabelled target data set by only using a pre-trained source model.
We propose quantifying the uncertainty in the source model predictions and utilizing it to guide the target adaptation.
arXiv Detail & Related papers (2022-08-16T08:03:30Z) - Source-Free Domain Adaptation via Distribution Estimation [106.48277721860036]
Domain Adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distributions are different.
Recently, Source-Free Domain Adaptation (SFDA) has drawn much attention, which tries to tackle domain adaptation problem without using source data.
In this work, we propose a novel framework called SFDA-DE to address SFDA task via source Distribution Estimation.
arXiv Detail & Related papers (2022-04-24T12:22:19Z) - Source Data-absent Unsupervised Domain Adaptation through Hypothesis
Transfer and Labeling Transfer [137.36099660616975]
Unsupervised domain adaptation (UDA) aims to transfer knowledge from a related but different well-labeled source domain to a new unlabeled target domain.
Most existing UDA methods require access to the source data, and thus are not applicable when the data are confidential and not shareable due to privacy concerns.
This paper aims to tackle a realistic setting in which only a classification model trained on the source data is available, without access to the source data itself.
arXiv Detail & Related papers (2020-12-14T07:28:50Z) - Progressive Graph Learning for Open-Set Domain Adaptation [48.758366879597965]
Domain shift is a fundamental problem in visual recognition which typically arises when the source and target data follow different distributions.
In this paper, we tackle a more realistic problem of open-set domain shift where the target data contains additional classes that are not present in the source data.
We introduce an end-to-end Progressive Graph Learning framework where a graph neural network with episodic training is integrated to suppress underlying conditional shift.
arXiv Detail & Related papers (2020-06-22T09:10:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.