Unsupervised Domain Adaptation in the Absence of Source Data
- URL: http://arxiv.org/abs/2007.10233v1
- Date: Mon, 20 Jul 2020 16:22:14 GMT
- Title: Unsupervised Domain Adaptation in the Absence of Source Data
- Authors: Roshni Sahoo, Divya Shanmugam, John Guttag
- Abstract summary: We propose an unsupervised method for adapting a source classifier to a target domain that varies from the source domain along natural axes.
We validate our method in scenarios where the distribution shift involves brightness, contrast, and rotation and show that it outperforms fine-tuning baselines in scenarios with limited labeled data.
- Score: 0.7366405857677227
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Current unsupervised domain adaptation methods can address many types of
distribution shift, but they assume data from the source domain is freely
available. As the use of pre-trained models becomes more prevalent, it is
reasonable to assume that source data is unavailable. We propose an
unsupervised method for adapting a source classifier to a target domain that
varies from the source domain along natural axes, such as brightness and
contrast. Our method only requires access to unlabeled target instances and the
source classifier. We validate our method in scenarios where the distribution
shift involves brightness, contrast, and rotation and show that it outperforms
fine-tuning baselines in scenarios with limited labeled data.
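The abstract does not spell out the adaptation procedure, but the setting (a frozen source classifier, unlabeled target instances, and a shift along natural axes such as brightness and contrast) can be illustrated with a small sketch: learn a global contrast/brightness correction of the target inputs by minimizing the frozen classifier's prediction entropy. Everything below (the entropy objective, the `adapt_input_transform` helper, the optimizer settings) is a hypothetical illustration, not the paper's actual method.

```python
# Hypothetical sketch: adapt to a brightness/contrast shift by fitting a
# per-dataset input correction that minimizes the frozen source classifier's
# prediction entropy on unlabeled target data. All names and the objective
# are illustrative assumptions, not the paper's procedure.
import itertools

import torch
import torch.nn.functional as F


def adapt_input_transform(source_model, target_loader, steps=100, lr=1e-2):
    """Fit a global contrast (scale) and brightness (shift) correction."""
    source_model.eval()
    for p in source_model.parameters():          # source classifier stays frozen
        p.requires_grad_(False)
    scale = torch.ones(1, requires_grad=True)    # contrast-like correction
    shift = torch.zeros(1, requires_grad=True)   # brightness-like correction
    opt = torch.optim.Adam([scale, shift], lr=lr)

    # target_loader is assumed to yield batches of unlabeled target images.
    for x in itertools.islice(target_loader, steps):
        logits = source_model(x * scale + shift)
        probs = F.softmax(logits, dim=1)
        entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()
        opt.zero_grad()
        entropy.backward()
        opt.step()
    return scale.detach(), shift.detach()
```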
Related papers
- Source-Free Domain Adaptation via Distribution Estimation [106.48277721860036]
Domain Adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distributions are different.
Recently, Source-Free Domain Adaptation (SFDA), which tries to tackle the domain adaptation problem without using source data, has drawn much attention.
In this work, we propose a novel framework called SFDA-DE to address the SFDA task via source Distribution Estimation.
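The summary only names the idea of source Distribution Estimation. One common way to approximate a source-like feature distribution without source data is to fit class-conditional Gaussian statistics to target features that the source model pseudo-labels confidently; the sketch below follows that reading. The function name, confidence threshold, and Gaussian assumption are illustrative, not the SFDA-DE implementation.

```python
# Hedged sketch: estimate per-class (mean, variance) feature statistics from
# confidently pseudo-labeled target data as a stand-in for the unseen source
# distribution. Threshold and helper names are assumptions for illustration.
import torch
import torch.nn.functional as F


@torch.no_grad()
def estimate_class_gaussians(feature_extractor, classifier, target_loader,
                             num_classes, conf_threshold=0.9):
    feats, labels = [], []
    for x in target_loader:                       # unlabeled target batches
        f = feature_extractor(x)
        p = F.softmax(classifier(f), dim=1)
        conf, pseudo = p.max(dim=1)
        keep = conf > conf_threshold              # keep confident samples only
        feats.append(f[keep])
        labels.append(pseudo[keep])
    feats, labels = torch.cat(feats), torch.cat(labels)

    stats = {}
    for c in range(num_classes):
        fc = feats[labels == c]
        if len(fc) > 1:
            stats[c] = (fc.mean(dim=0), fc.var(dim=0, unbiased=False))
    return stats  # per-class statistics of the estimated source-like features
```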
arXiv Detail & Related papers (2022-04-24T12:22:19Z) - Source-Free Domain Adaptive Fundus Image Segmentation with Denoised
Pseudo-Labeling [56.98020855107174]
Domain adaptation typically requires access to source domain data in order to use its distribution information for domain alignment with the target data.
In many real-world scenarios, the source data may not be accessible during model adaptation in the target domain due to privacy issues.
We present a novel denoised pseudo-labeling method for this problem, which effectively makes use of the source model and unlabeled target data.
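As a rough illustration of confidence-filtered ("denoised") pseudo-labeling with a frozen source model, the sketch below masks out low-confidence pseudo-labels before computing the adaptation loss. The threshold and the cross-entropy loss are assumptions for illustration, not the paper's exact denoising scheme.

```python
# Minimal sketch of confidence-filtered pseudo-labeling: only predictions the
# (frozen) teacher/source model is confident about contribute to the loss.
# Works for classification (N, C) and segmentation (N, C, H, W) logits.
import torch
import torch.nn.functional as F


def pseudo_label_loss(student_logits, teacher_logits, conf_threshold=0.75):
    with torch.no_grad():
        probs = F.softmax(teacher_logits, dim=1)   # source-model predictions
        conf, pseudo = probs.max(dim=1)
        mask = (conf > conf_threshold).float()     # drop noisy, low-confidence labels
    loss = F.cross_entropy(student_logits, pseudo, reduction="none")
    return (loss * mask).sum() / mask.sum().clamp_min(1.0)
```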
arXiv Detail & Related papers (2021-09-19T06:38:21Z) - Uncertainty-Guided Mixup for Semi-Supervised Domain Adaptation without
Source Data [37.26484185691251]
Source-free domain adaptation aims to solve the problem by performing domain adaptation without accessing the source data.
We propose uncertainty-guided Mixup to reduce the representation's intra-domain discrepancy and perform inter-domain alignment without directly accessing the source data.
Our method outperforms the recent semi-supervised baselines and the unsupervised variant achieves competitive performance.
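A hedged sketch of what uncertainty-guided Mixup could look like on unlabeled target data: pairs of target samples are mixed, with the mixing weight biased toward the member of the pair the model is more certain about (lower predictive entropy). The weighting rule and the soft mixed targets are illustrative assumptions, not the paper's formulation.

```python
# Hedged sketch of uncertainty-guided Mixup on a batch of target samples.
import torch
import torch.nn.functional as F


def uncertainty_guided_mixup(x, logits):
    probs = F.softmax(logits, dim=1)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1)  # uncertainty
    perm = torch.randperm(x.size(0))
    # Weight each pair toward its more certain (lower-entropy) member.
    certainty_a = 1.0 / (1.0 + entropy)
    certainty_b = 1.0 / (1.0 + entropy[perm])
    lam = certainty_a / (certainty_a + certainty_b)       # lies in (0, 1)
    lam_x = lam.view(-1, *([1] * (x.dim() - 1)))          # broadcast over image dims
    mixed_x = lam_x * x + (1 - lam_x) * x[perm]
    mixed_p = lam.unsqueeze(1) * probs + (1 - lam).unsqueeze(1) * probs[perm]
    return mixed_x, mixed_p   # e.g., train against mixed_p as soft targets (assumption)
```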
arXiv Detail & Related papers (2021-07-14T13:54:02Z) - Unsupervised Multi-source Domain Adaptation Without Access to Source
Data [58.551861130011886]
Unsupervised Domain Adaptation (UDA) aims to learn a predictor model for an unlabeled domain by transferring knowledge from a separate labeled source domain.
We propose a novel and efficient algorithm which automatically combines the source models with suitable weights in such a way that it performs at least as well as the best source model.
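One simple way to realize such a combination, sketched below under stated assumptions, is to learn convex weights over the frozen source models' predictions on unlabeled target data, for example by minimizing the entropy of the combined prediction. The softmax parameterization and the entropy objective are illustrative, not the paper's algorithm.

```python
# Hedged sketch: fit convex combination weights over several frozen source
# models using only unlabeled target batches.
import torch
import torch.nn.functional as F


def fit_combination_weights(source_models, target_loader, steps=200, lr=0.05):
    for m in source_models:
        m.eval()
    w = torch.zeros(len(source_models), requires_grad=True)  # softmax -> convex weights
    opt = torch.optim.Adam([w], lr=lr)

    data = iter(target_loader)                 # assumed to yield unlabeled image batches
    for _ in range(steps):
        try:
            x = next(data)
        except StopIteration:
            data = iter(target_loader)
            x = next(data)
        with torch.no_grad():
            probs = torch.stack([F.softmax(m(x), dim=1) for m in source_models])
        mix = (F.softmax(w, dim=0).view(-1, 1, 1) * probs).sum(dim=0)
        entropy = -(mix * mix.clamp_min(1e-8).log()).sum(dim=1).mean()
        opt.zero_grad()
        entropy.backward()
        opt.step()
    return F.softmax(w.detach(), dim=0)        # one weight per source model
```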
arXiv Detail & Related papers (2021-04-05T10:45:12Z) - Domain Impression: A Source Data Free Domain Adaptation Method [27.19677042654432]
Unsupervised domain adaptation methods solve the adaptation problem for an unlabeled target set, assuming that the source dataset is available with all labels.
This paper proposes a domain adaptation technique that does not need any source data.
Instead of the source data, we are only provided with a classifier that is trained on the source data.
arXiv Detail & Related papers (2021-02-17T19:50:49Z) - Source-free Domain Adaptation via Distributional Alignment by Matching
Batch Normalization Statistics [85.75352990739154]
We propose a novel domain adaptation method for the source-free setting.
We use batch normalization statistics stored in the pretrained model to approximate the distribution of unobserved source data.
Our method achieves competitive performance with state-of-the-art domain adaptation methods.
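A minimal sketch of the underlying idea, assuming a PyTorch model with BatchNorm2d layers: compare the target batch statistics observed at each BN layer with the stored running mean and variance, and penalize the mismatch. The squared-difference penalty is an illustrative stand-in for the divergence actually used in the paper.

```python
# Hedged sketch: use stored BatchNorm running statistics as a proxy for the
# unseen source distribution and penalize deviation of target batch statistics.
import torch
import torch.nn as nn


def bn_statistics_loss(model, x):
    losses = []

    def hook(module, inputs, output):
        feat = inputs[0]                                    # (N, C, H, W) input to the BN layer
        dims = [0] + list(range(2, feat.dim()))             # all dims except the channel dim
        mean = feat.mean(dim=dims)
        var = feat.var(dim=dims, unbiased=False)
        losses.append(((mean - module.running_mean) ** 2).mean()
                      + ((var - module.running_var) ** 2).mean())

    handles = [m.register_forward_hook(hook)
               for m in model.modules() if isinstance(m, nn.BatchNorm2d)]
    logits = model(x)                                       # hooks collect per-layer penalties
    for h in handles:
        h.remove()
    loss = torch.stack(losses).sum() if losses else torch.tensor(0.0)
    return logits, loss
```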
arXiv Detail & Related papers (2021-01-19T14:22:33Z) - Open-Set Hypothesis Transfer with Semantic Consistency [99.83813484934177]
We introduce a method that focuses on the semantic consistency under transformation of target data.
Our model first discovers confident predictions and performs classification with pseudo-labels.
As a result, unlabeled data can be classified into discriminative classes that coincide with either the source classes or an unknown class.
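As a rough sketch of semantic consistency under transformation, the snippet below keeps a pseudo-label only when the model predicts the same class, confidently, on a target sample and on an augmented view of it, and otherwise routes the sample toward an "unknown" bucket. The augmentation, thresholds, and UNKNOWN placeholder are assumptions for illustration, not the paper's criterion.

```python
# Hedged sketch: pseudo-label selection by prediction consistency between a
# target sample and a transformed (augmented) view of it.
import torch
import torch.nn.functional as F

UNKNOWN = -1  # placeholder label for samples routed to the unknown class


@torch.no_grad()
def consistent_pseudo_labels(model, x, x_aug, conf_threshold=0.8):
    p = F.softmax(model(x), dim=1)
    p_aug = F.softmax(model(x_aug), dim=1)
    conf, label = p.max(dim=1)
    conf_aug, label_aug = p_aug.max(dim=1)
    consistent = (label == label_aug) & (conf > conf_threshold) \
                 & (conf_aug > conf_threshold)
    return torch.where(consistent, label, torch.full_like(label, UNKNOWN))
```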
arXiv Detail & Related papers (2020-10-01T10:44:31Z) - Do We Really Need to Access the Source Data? Source Hypothesis Transfer
for Unsupervised Domain Adaptation [102.67010690592011]
Unsupervised Domain Adaptation (UDA) aims to leverage the knowledge learned from a labeled source dataset to solve similar tasks in a new unlabeled domain.
Prior UDA methods typically require access to the source data when learning to adapt the model.
This work tackles a practical setting where only a trained source model is available and investigates how such a model can be effectively utilized, without source data, to solve UDA problems.
arXiv Detail & Related papers (2020-02-20T03:13:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.