Universal Source-Free Domain Adaptation
- URL: http://arxiv.org/abs/2004.04393v1
- Date: Thu, 9 Apr 2020 07:26:20 GMT
- Title: Universal Source-Free Domain Adaptation
- Authors: Jogendra Nath Kundu, Naveen Venkat, Rahul M V, R. Venkatesh Babu
- Abstract summary: We propose a novel two-stage learning process for domain adaptation.
In the Procurement stage, we aim to equip the model for future source-free deployment, assuming no prior knowledge of the upcoming category-gap and domain-shift.
In the Deployment stage, the goal is to design a unified adaptation algorithm capable of operating across a wide range of category-gaps.
- Score: 57.37520645827318
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: There is a strong incentive to develop versatile learning techniques that can
transfer the knowledge of class-separability from a labeled source domain to an
unlabeled target domain in the presence of a domain-shift. Existing domain
adaptation (DA) approaches are not equipped for practical DA scenarios as a
result of their reliance on knowledge of the source-target label-set
relationship (e.g., Closed-set, Open-set or Partial DA). Furthermore, almost all
prior unsupervised DA works require coexistence of source and target samples
even during deployment, making them unsuitable for real-time adaptation. Devoid
of such impractical assumptions, we propose a novel two-stage learning process.
1) In the Procurement stage, we aim to equip the model for future source-free
deployment, assuming no prior knowledge of the upcoming category-gap and
domain-shift. To achieve this, we enhance the model's ability to reject
out-of-source distribution samples by leveraging the available source data, in
a novel generative classifier framework. 2) In the Deployment stage, the goal
is to design a unified adaptation algorithm capable of operating across a wide
range of category-gaps, with no access to the previously seen source samples.
To this end, in contrast to the usage of complex adversarial training regimes,
we define a simple yet effective source-free adaptation objective by utilizing
a novel instance-level weighting mechanism named the Source Similarity Metric
(SSM). A thorough evaluation shows the practical usability of the proposed
learning framework with superior DA performance even over state-of-the-art
source-dependent approaches.
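To make the deployment-stage idea concrete, the sketch below illustrates an SSM-style instance weighting in PyTorch. It is a minimal illustration under assumed simplifications (the classifier exposes C source-class logits plus K simulated "negative" out-of-source logits, and the SSM is approximated from the model's own softmax mass over these two groups); the paper's exact SSM definition and adaptation objective differ in detail.

```python
import torch
import torch.nn.functional as F

def source_similarity(logits: torch.Tensor, num_src_classes: int) -> torch.Tensor:
    # Hypothetical SSM proxy: softmax mass on the source classes minus the mass
    # on the simulated negative (out-of-source) classes, rescaled to [0, 1].
    probs = F.softmax(logits, dim=1)
    p_src = probs[:, :num_src_classes].sum(dim=1)
    p_neg = probs[:, num_src_classes:].sum(dim=1)
    return 0.5 * (p_src - p_neg + 1.0)

def deployment_loss(logits: torch.Tensor, num_src_classes: int) -> torch.Tensor:
    # Weighted entropy-style surrogate for the source-free adaptation objective:
    # target instances that look "source-like" (high SSM) are sharpened over the
    # source classes, while low-SSM instances are pushed toward the negative
    # classes that stand in for target-private categories.
    w = source_similarity(logits, num_src_classes).detach()
    probs = F.softmax(logits, dim=1)
    ent_src = -(probs[:, :num_src_classes]
                * torch.log(probs[:, :num_src_classes] + 1e-8)).sum(dim=1)
    ent_neg = -(probs[:, num_src_classes:]
                * torch.log(probs[:, num_src_classes:] + 1e-8)).sum(dim=1)
    return (w * ent_src + (1.0 - w) * ent_neg).mean()

# Usage sketch: with a source-trained classifier producing (C + K) logits per
# target image, minimise deployment_loss over unlabeled target batches.
# logits = model(target_batch)          # model and target_batch are assumed
# loss = deployment_loss(logits, num_src_classes=C)
```

The weight is detached so it acts purely as an instance-level confidence, mirroring the role the abstract assigns to the SSM; no source samples are needed at this stage.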
Related papers
- Unified Source-Free Domain Adaptation [44.95240684589647]
In pursuit of transferring a source model to a target domain without access to the source training data, Source-Free Domain Adaptation (SFDA) has been extensively explored.
We propose a novel approach called Latent Causal Factors Discovery (LCFD).
In contrast to previous alternatives that emphasize learning the statistical description of reality, we formulate LCFD from a causality perspective.
arXiv Detail & Related papers (2024-03-12T12:40:08Z)
- Balancing Discriminability and Transferability for Source-Free Domain Adaptation [55.143687986324935]
Conventional domain adaptation (DA) techniques aim to improve domain transferability by learning domain-invariant representations.
The requirement of simultaneous access to labeled source and unlabeled target renders them unsuitable for the challenging source-free DA setting.
We derive novel insights to show that a mixup between original and corresponding translated generic samples enhances the discriminability-transferability trade-off.
arXiv Detail & Related papers (2022-06-16T09:06:22Z)
- Source-Free Domain Adaptation via Distribution Estimation [106.48277721860036]
Domain Adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distributions are different.
Recently, Source-Free Domain Adaptation (SFDA) has drawn much attention; it tries to tackle the domain adaptation problem without using source data.
In this work, we propose a novel framework called SFDA-DE to address SFDA task via source Distribution Estimation.
arXiv Detail & Related papers (2022-04-24T12:22:19Z)
- UMAD: Universal Model Adaptation under Domain and Category Shift [138.12678159620248]
The Universal Model ADaptation (UMAD) framework handles both open-set and open-partial-set UDA scenarios without access to source data.
We develop an informative consistency score to help distinguish unknown samples from known samples.
Experiments on open-set and open-partial-set UDA scenarios demonstrate that UMAD exhibits comparable, if not superior, performance to state-of-the-art data-dependent methods.
arXiv Detail & Related papers (2021-12-16T01:22:59Z)
- Source-Free Domain Adaptation for Semantic Segmentation [11.722728148523366]
Unsupervised Domain Adaptation (UDA) can address the heavy reliance of convolutional neural network-based semantic segmentation approaches on pixel-level annotated data.
We propose a source-free domain adaptation framework for semantic segmentation, namely SFDA, in which only a well-trained source model and an unlabeled target domain dataset are available for adaptation.
arXiv Detail & Related papers (2021-03-30T14:14:29Z)
- Towards Inheritable Models for Open-Set Domain Adaptation [56.930641754944915]
We introduce a practical Domain Adaptation paradigm where a source-trained model is used to facilitate future adaptation in the absence of the source dataset.
We present an objective way to quantify inheritability to enable the selection of the most suitable source model for a given target domain, even in the absence of the source data.
arXiv Detail & Related papers (2020-04-09T07:16:30Z)
- Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation [102.67010690592011]
Unsupervised Domain Adaptation (UDA) aims to leverage the knowledge learned from a labeled source dataset to solve similar tasks in a new unlabeled domain.
Prior UDA methods typically require access to the source data when learning to adapt the model.
This work tackles a practical setting where only a trained source model is available, and investigates how to effectively utilize such a model without source data to solve UDA problems.
arXiv Detail & Related papers (2020-02-20T03:13:58Z)