Gradual Domain Adaptation: Theory and Algorithms
- URL: http://arxiv.org/abs/2310.13852v1
- Date: Fri, 20 Oct 2023 23:02:08 GMT
- Title: Gradual Domain Adaptation: Theory and Algorithms
- Authors: Yifei He, Haoxiang Wang, Bo Li, Han Zhao
- Abstract summary: Unsupervised domain adaptation (UDA) adapts a model from a labeled source domain to an unlabeled target domain in a one-off way.
In this work, we first theoretically analyze gradual self-training, a popular GDA algorithm, and provide a significantly improved generalization bound.
We propose $\textbf{G}$enerative Gradual D$\textbf{O}$main $\textbf{A}$daptation with Optimal $\textbf{T}$ransport (GOAT).
- Score: 15.278170387810409
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unsupervised domain adaptation (UDA) adapts a model from a labeled source
domain to an unlabeled target domain in a one-off way. Though widely applied,
UDA faces a great challenge whenever the distribution shift between the source
and the target is large. Gradual domain adaptation (GDA) mitigates this
limitation by using intermediate domains to gradually adapt from the source to
the target domain. In this work, we first theoretically analyze gradual
self-training, a popular GDA algorithm, and provide a significantly improved
generalization bound compared with Kumar et al. (2020). Our theoretical
analysis leads to an interesting insight: to minimize the generalization error
on the target domain, the sequence of intermediate domains should be placed
uniformly along the Wasserstein geodesic between the source and target domains.
The insight is particularly useful when intermediate domains are missing or
scarce, which is often the case in real-world applications. Based on this
insight, we propose $\textbf{G}$enerative Gradual
D$\textbf{O}$main $\textbf{A}$daptation with Optimal $\textbf{T}$ransport
(GOAT), an algorithmic framework that can generate intermediate domains in a
data-dependent way. More concretely, we first generate intermediate domains
along the Wasserstein geodesic between two given consecutive domains in a
feature space, then apply gradual self-training to adapt the source-trained
classifier to the target along the sequence of intermediate domains.
Empirically, we demonstrate that our GOAT framework can improve the performance
of standard GDA when the given intermediate domains are scarce, significantly
broadening the real-world application scenarios of GDA. Our code is available
at https://github.com/yifei-he/GOAT.
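To make the two-step recipe concrete, here is a minimal sketch under stated assumptions: it works on raw 2-D features rather than the learned feature space the paper uses, assumes the two domains have equally many points so that exact optimal transport reduces to a linear assignment (solved with SciPy), and uses a plain logistic-regression classifier as a stand-in for the source-trained model. The helper names (`wasserstein_interpolants`, `gradual_self_train`) are illustrative, not from the released GOAT code. For empirical measures, the Wasserstein-2 geodesic is the displacement interpolation $\mu_t = ((1-t)\,\mathrm{id} + t\,T)_{\#}\mu_0$, where $T$ is the optimal transport map.

```python
# A minimal sketch of the GOAT recipe (illustrative, not the released code):
# (1) generate intermediate domains by displacement interpolation along the
#     Wasserstein-2 geodesic between two consecutive domains, and
# (2) run gradual self-training through the resulting domain sequence.
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.linear_model import LogisticRegression


def wasserstein_interpolants(X_a, X_b, num_steps):
    """Interpolate two equal-sized point clouds along the W2 geodesic.

    For uniform discrete measures with equally many points, exact optimal
    transport reduces to a linear assignment, and the geodesic is the
    straight-line (displacement) interpolation between matched pairs.
    """
    # Squared-Euclidean transport costs between all cross-domain pairs.
    cost = ((X_a[:, None, :] - X_b[None, :, :]) ** 2).sum(axis=-1)
    row, col = linear_sum_assignment(cost)  # optimal one-to-one matching
    # Place interpolants uniformly along the geodesic, t = k / (num_steps+1),
    # mirroring the paper's uniform-spacing insight.
    ts = np.linspace(0.0, 1.0, num_steps + 2)[1:-1]
    return [(1.0 - t) * X_a[row] + t * X_b[col] for t in ts]


def gradual_self_train(model, X_src, y_src, domains):
    """Fit on the labeled source, then self-train through each domain in order."""
    model.fit(X_src, y_src)
    for X in domains:
        pseudo = model.predict(X)  # pseudo-label the next domain...
        model.fit(X, pseudo)       # ...and refit on those pseudo-labels
    return model


# Toy usage: a 2-class source and a target shifted too far for one-off UDA.
rng = np.random.default_rng(0)
y_src = rng.integers(0, 2, size=400)
X_src = rng.normal(size=(400, 2)) + 3.0 * y_src[:, None]
X_tgt = X_src + 6.0  # large covariate shift
mids = wasserstein_interpolants(X_src, X_tgt, num_steps=4)
clf = gradual_self_train(LogisticRegression(max_iter=1000), X_src, y_src,
                         mids + [X_tgt])
```

Uniform values of $t$ place the generated domains evenly along the geodesic, which is the placement the theoretical analysis above identifies as minimizing the target generalization error.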
Related papers
- Improving Domain Adaptation Through Class Aware Frequency Transformation [15.70058524548143]
Most Unsupervised Domain Adaptation (UDA) algorithms focus on reducing the global domain shift between the labelled source and unlabelled target domains.
We propose Class Aware Frequency Transformation (CAFT), a novel approach based on a traditional image processing technique.
CAFT utilizes pseudo-label-based, class-consistent low-frequency swapping to improve the overall performance of existing UDA algorithms.
arXiv Detail & Related papers (2024-07-28T18:16:41Z) - Gradual Domain Adaptation without Indexed Intermediate Domains [23.726336635748783]
We propose a coarse-to-fine framework to discover the sequence of intermediate domains.
We show that our approach can lead to comparable or even better adaptation performance than using a pre-defined domain sequence.
arXiv Detail & Related papers (2022-07-11T02:25:39Z) - Gradual Domain Adaptation via Normalizing Flows [2.7467053150385956]
A large gap between the source and target domains makes adaptation difficult.
Gradual domain adaptation is one approach to addressing this problem.
We propose the use of normalizing flows to deal with this problem.
arXiv Detail & Related papers (2022-06-23T06:24:50Z) - Source-Free Domain Adaptation via Distribution Estimation [106.48277721860036]
Domain Adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distributions are different.
Recently, Source-Free Domain Adaptation (SFDA), which tries to tackle the domain adaptation problem without using source data, has drawn much attention.
In this work, we propose a novel framework called SFDA-DE to address SFDA task via source Distribution Estimation.
arXiv Detail & Related papers (2022-04-24T12:22:19Z) - Understanding Gradual Domain Adaptation: Improved Analysis, Optimal Path
and Beyond [20.518134448156744]
Gradual domain adaptation (GDA) assumes a path of $(T-1)$ unlabeled intermediate domains bridging the source and target.
We prove a significantly improved generalization bound of $\varepsilon_0+\widetilde{O}\left(T\Delta+T/\sqrt{n}\right)$, where $\Delta$ is the average distributional distance between consecutive domains, in contrast to the $e^{O(T)}\left(\varepsilon_0+O\left(\sqrt{\log(T)/n}\right)\right)$ bound of Kumar et al. (2020); see the side-by-side reading after this list.
arXiv Detail & Related papers (2022-04-18T07:39:23Z) - Domain-Agnostic Prior for Transfer Semantic Segmentation [197.9378107222422]
Unsupervised domain adaptation (UDA) is an important topic in the computer vision community.
We present a mechanism that regularizes cross-domain representation learning with a domain-agnostic prior (DAP).
Our research reveals that UDA benefits much from better proxies, possibly from other data modalities.
arXiv Detail & Related papers (2022-04-06T09:13:25Z) - IDM: An Intermediate Domain Module for Domain Adaptive Person Re-ID [58.46907388691056]
We argue that bridging the source and target domains can be exploited to tackle the UDA re-ID task.
We propose an Intermediate Domain Module (IDM) to generate intermediate domains' representations on-the-fly.
Our proposed method outperforms state-of-the-art methods by a large margin on all the common UDA re-ID tasks.
arXiv Detail & Related papers (2021-08-05T07:19:46Z) - Instance Level Affinity-Based Transfer for Unsupervised Domain
Adaptation [74.71931918541748]
We propose an instance-affinity-based criterion for source-to-target transfer during adaptation, called ILA-DA.
We first propose a reliable and efficient method to extract similar and dissimilar samples across source and target, and utilize a multi-sample contrastive loss to drive the domain alignment process.
We verify the effectiveness of ILA-DA by observing consistent improvements in accuracy over popular domain adaptation approaches on a variety of benchmark datasets.
arXiv Detail & Related papers (2021-04-03T01:33:14Z) - Unsupervised Model Adaptation for Continual Semantic Segmentation [15.820660013260584]
We develop an algorithm for adapting a semantic segmentation model that is trained using a labeled source domain to generalize well in an unlabeled target domain.
We provide theoretical analysis and explain conditions under which our algorithm is effective.
Experiments on benchmark adaptation tasks demonstrate that our method achieves competitive performance, even compared with joint UDA approaches.
arXiv Detail & Related papers (2020-09-26T04:55:50Z) - Domain Conditioned Adaptation Network [90.63261870610211]
We propose a Domain Conditioned Adaptation Network (DCAN) to excite distinct convolutional channels with a domain conditioned channel attention mechanism.
This is the first work to explore the domain-wise convolutional channel activation for deep DA networks.
arXiv Detail & Related papers (2020-05-14T04:23:24Z) - Mind the Gap: Enlarging the Domain Gap in Open Set Domain Adaptation [65.38975706997088]
Open set domain adaptation (OSDA) assumes the presence of unknown classes in the target domain.
We show that existing state-of-the-art methods suffer a considerable performance drop in the presence of larger domain gaps.
We propose a novel framework to specifically address the larger domain gaps.
arXiv Detail & Related papers (2020-03-08T14:20:24Z)
This list is automatically generated from the titles and abstracts of the papers on this site.