Generalized Source-free Domain Adaptation
- URL: http://arxiv.org/abs/2108.01614v1
- Date: Tue, 3 Aug 2021 16:34:12 GMT
- Title: Generalized Source-free Domain Adaptation
- Authors: Shiqi Yang, Yaxing Wang, Joost van de Weijer, Luis Herranz, Shangling Jui
- Abstract summary: We propose a new domain adaptation paradigm called Generalized Source-free Domain Adaptation (G-SFDA).
For target performance our method is on par with or better than existing DA and SFDA methods, specifically it achieves state-of-the-art performance (85.4%) on VisDA.
- Score: 47.907168218249694
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Domain adaptation (DA) aims to transfer the knowledge learned from a source
domain to an unlabeled target domain. Some recent works tackle source-free
domain adaptation (SFDA) where only a source pre-trained model is available for
adaptation to the target domain. However, those methods do not consider
preserving source performance, which is of high practical value in real-world
applications.
In this paper, we propose a new domain adaptation paradigm called Generalized
Source-free Domain Adaptation (G-SFDA), where the learned model needs to
perform well on both the target and source domains, with only access to current
unlabeled target data during adaptation. First, we propose local structure
clustering (LSC), which clusters each target feature with its semantically
similar neighbors and successfully adapts the model to the target domain in
the absence of source data. Second, we propose sparse domain attention (SDA),
which produces a binary, domain-specific attention that activates different
feature channels for different domains; this attention is also used to
regularize the gradient during adaptation to preserve source information. In the
experiments, our target performance is on par with or better than that of
existing DA and SFDA methods; in particular, our method achieves
state-of-the-art performance (85.4%) on VisDA, and the adapted model performs
well on all domains after adapting to single or multiple target domains. Code
is available at https://github.com/Albert0147/G-SFDA.
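The two components in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation (their code is at the repository above); function names, the cosine-similarity choice for LSC, and the exact masking rule for SDA are assumptions made here for illustration.

```python
import numpy as np

def lsc_neighbors(features, k=3):
    """Local structure clustering, step 1 (sketch): for each target
    feature, find the indices of its k most similar neighbors by
    cosine similarity. LSC then encourages the model's predictions
    to agree with those of these neighbors."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T
    np.fill_diagonal(sim, -np.inf)          # exclude self-matches
    return np.argsort(-sim, axis=1)[:, :k]  # top-k most similar

def sda_masked_grad(grad, source_mask):
    """Sparse domain attention gradient regularization (sketch):
    zero the gradient on channels that the binary source attention
    marks as source-relevant, so target adaptation cannot overwrite
    the information those channels carry."""
    return grad * (1.0 - source_mask)
```

For example, with a source attention of `[1, 0, 1, 0]`, only the second and fourth channels receive gradient updates during target adaptation, leaving the source-relevant channels untouched.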
Related papers
- Style Adaptation for Domain-adaptive Semantic Segmentation [2.1365683052370046]
Domain discrepancy leads to a significant performance drop when network models trained on source-domain data are applied to the target domain.
We introduce a straightforward approach to mitigate the domain discrepancy, which necessitates no additional parameter calculations and seamlessly integrates with self-training-based UDA methods.
Our proposed method attains a UDA performance of 76.93 mIoU on the GTA->Cityscapes dataset, an improvement of +1.03 percentage points over the previous state-of-the-art results.
arXiv Detail & Related papers (2024-04-25T02:51:55Z) - AdAM: Few-Shot Image Generation via Adaptation-Aware Kernel Modulation [71.58154388819887]
Few-shot image generation (FSIG) aims to generate new and diverse images given few (e.g., 10) training samples.
Recent work has addressed FSIG by leveraging a GAN pre-trained on a large-scale source domain and adapting it to the target domain with few target samples.
We propose Adaptation-Aware kernel Modulation (AdAM) for general FSIG of different source-target domain proximity.
arXiv Detail & Related papers (2023-07-04T03:56:43Z) - Few-shot Image Generation via Adaptation-Aware Kernel Modulation [33.191479192580275]
Few-shot image generation (FSIG) aims to generate new and diverse samples given an extremely limited number of samples from a domain.
Recent work has addressed the problem using a transfer-learning approach, leveraging a GAN pretrained on a large-scale source domain dataset.
We propose Adaptation-Aware kernel Modulation (AdAM) to address general FSIG across different source-target domain proximities.
arXiv Detail & Related papers (2022-10-29T10:26:40Z) - Source-Free Domain Adaptation via Distribution Estimation [106.48277721860036]
Domain Adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distributions are different.
Recently, Source-Free Domain Adaptation (SFDA) has drawn much attention; it tackles the domain adaptation problem without using source data.
In this work, we propose a novel framework called SFDA-DE to address SFDA task via source Distribution Estimation.
arXiv Detail & Related papers (2022-04-24T12:22:19Z) - Instance Level Affinity-Based Transfer for Unsupervised Domain
Adaptation [74.71931918541748]
We propose an instance affinity based criterion for source to target transfer during adaptation, called ILA-DA.
We first propose a reliable and efficient method to extract similar and dissimilar samples across source and target, and utilize a multi-sample contrastive loss to drive the domain alignment process.
We verify the effectiveness of ILA-DA by observing consistent improvements in accuracy over popular domain adaptation approaches on a variety of benchmark datasets.
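The multi-sample contrastive loss mentioned above can be sketched in NumPy as an InfoNCE-style objective over several similar ("positive") and dissimilar ("negative") samples. This is an illustrative sketch, not the ILA-DA implementation; the function name, the cosine-similarity measure, and the temperature `tau` are assumptions.

```python
import numpy as np

def contrastive_loss(anchor, positives, negatives, tau=0.1):
    """InfoNCE-style loss over several positives and negatives:
    pull the anchor toward similar samples and push it away from
    dissimilar ones, the alignment pressure a multi-sample
    contrastive objective applies across domains."""
    def cos(a, b):
        # cosine similarity between a vector and each row of a matrix
        return (a @ b.T) / (np.linalg.norm(a) * np.linalg.norm(b, axis=1))
    pos = np.exp(cos(anchor, positives) / tau)
    neg = np.exp(cos(anchor, negatives) / tau)
    return -np.log(pos.sum() / (pos.sum() + neg.sum()))
```

When the anchor is already close to its positives and far from its negatives, the loss is near zero; mismatched anchors incur a large penalty, which is what drives the domain-alignment process.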
arXiv Detail & Related papers (2021-04-03T01:33:14Z) - VDM-DA: Virtual Domain Modeling for Source Data-free Domain Adaptation [26.959377850768423]
Domain adaptation aims to leverage a label-rich domain (the source domain) to help model learning in a label-scarce domain (the target domain).
Access to the source-domain samples may not always be feasible in real-world applications for various reasons.
We propose a novel approach referred to as Virtual Domain Modeling (VDM-DA)
arXiv Detail & Related papers (2021-03-26T09:56:40Z) - Effective Label Propagation for Discriminative Semi-Supervised Domain
Adaptation [76.41664929948607]
Semi-supervised domain adaptation (SSDA) methods have demonstrated great potential in large-scale image classification tasks.
We present a novel and effective method to tackle this problem by using effective inter-domain and intra-domain semantic information propagation.
Our source code and pre-trained models will be released soon.
arXiv Detail & Related papers (2020-12-04T14:28:19Z) - Cross-domain Self-supervised Learning for Domain Adaptation with Few
Source Labels [78.95901454696158]
We propose a novel Cross-Domain Self-supervised learning approach for domain adaptation.
Our method significantly boosts performance of target accuracy in the new target domain with few source labels.
arXiv Detail & Related papers (2020-03-18T15:11:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.