Source-Free Unsupervised Domain Adaptation: A Survey
- URL: http://arxiv.org/abs/2301.00265v1
- Date: Sat, 31 Dec 2022 18:44:45 GMT
- Title: Source-Free Unsupervised Domain Adaptation: A Survey
- Authors: Yuqi Fang, Pew-Thian Yap, Weili Lin, Hongtu Zhu, and Mingxia Liu
- Abstract summary: Unsupervised domain adaptation (UDA) via deep learning has attracted considerable attention for tackling domain-shift problems.
Many source-free unsupervised domain adaptation (SFUDA) methods have been proposed recently, which perform knowledge transfer from a pre-trained source model to an unlabeled target domain.
This paper provides a timely and systematic literature review of existing SFUDA approaches from a technical perspective.
- Score: 32.48017861767467
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unsupervised domain adaptation (UDA) via deep learning has attracted
considerable attention for tackling domain-shift problems caused by distribution
discrepancy across different domains. Existing UDA approaches depend heavily on
the accessibility of source domain data, which is usually limited in practical
scenarios due to privacy protection, data storage and transmission costs, and
computation burden. To tackle this issue, many source-free unsupervised domain
adaptation (SFUDA) methods have been proposed recently, which perform knowledge
transfer from a pre-trained source model to an unlabeled target domain without
access to source data. A comprehensive review of these SFUDA works is of great
significance. In this paper, we provide a timely and systematic literature
review of existing SFUDA approaches from a technical perspective. Specifically,
we categorize current SFUDA studies into two groups, i.e., white-box SFUDA and
black-box SFUDA, and further divide them into finer subcategories based on the
learning strategies they use. We also investigate the challenges of methods in
each subcategory, discuss the advantages and disadvantages of white-box and
black-box SFUDA methods, review the commonly used benchmark datasets, and
summarize popular techniques for improving the generalizability of models
learned without source data. We finally discuss several promising future
directions in this field.
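For illustration of the black-box setting described in the abstract, the sketch below distills a source model's soft predictions into a separate target model using only unlabeled target data. This is a minimal, generic recipe under assumed names (source_api, target_model, x_target are placeholders), not the specific method of the surveyed paper.

```python
import torch
import torch.nn.functional as F

def black_box_adaptation_step(source_api, target_model, optimizer, x_target, T=2.0):
    """One black-box SFUDA step (sketch): the source model is queried only for
    its predictions (no access to its weights or to source data), and a
    separate target model is trained on unlabeled target inputs via
    knowledge distillation."""
    with torch.no_grad():
        # Teacher signal: soft labels from the black-box source model.
        teacher_probs = F.softmax(source_api(x_target) / T, dim=1)

    student_log_probs = F.log_softmax(target_model(x_target) / T, dim=1)
    # KL divergence between student and teacher distributions, scaled by T^2
    # as is standard in temperature-based distillation.
    loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * (T ** 2)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In the white-box setting, one would instead update the pre-trained source model's own parameters on the target data, for example by minimizing the entropy of its predictions or by self-training with its own pseudo-labels.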
Related papers
- More is Better: Deep Domain Adaptation with Multiple Sources [34.26271755493111]
Multi-source domain adaptation (MDA) is a powerful and practical extension in which the labeled data may be collected from multiple sources with different distributions.
In this survey, we first define various MDA strategies. Then we systematically summarize and compare modern MDA methods in the deep learning era from different perspectives.
arXiv Detail & Related papers (2024-05-01T03:37:12Z)
- UFDA: Universal Federated Domain Adaptation with Practical Assumptions [33.06684706053823]
This paper studies a more practical scenario named Universal Federated Domain Adaptation (UFDA).
It only requires the black-box model and the label set information of each source domain.
We propose a corresponding methodology called Hot-Learning with Contrastive Label Disambiguation (HCLD).
arXiv Detail & Related papers (2023-11-27T06:38:07Z)
- A Comprehensive Survey on Source-free Domain Adaptation [69.17622123344327]
The research of Source-Free Domain Adaptation (SFDA) has drawn growing attention in recent years.
We provide a comprehensive survey of recent advances in SFDA and organize them into a unified categorization scheme.
We compare the results of more than 30 representative SFDA methods on three popular classification benchmarks.
arXiv Detail & Related papers (2023-02-23T06:32:09Z)
- Learning Feature Decomposition for Domain Adaptive Monocular Depth Estimation [51.15061013818216]
Supervised approaches have led to great success with the advance of deep learning, but they rely on large quantities of ground-truth depth annotations.
Unsupervised domain adaptation (UDA) transfers knowledge from labeled source data to unlabeled target data, so as to relax the constraint of supervised learning.
We propose a novel UDA method for monocular depth estimation (MDE), referred to as Learning Feature Decomposition for Adaptation (LFDA), which learns to decompose the feature space into content and style components.
arXiv Detail & Related papers (2022-07-30T08:05:35Z)
- Source-Free Domain Adaptation via Distribution Estimation [106.48277721860036]
Domain Adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distributions are different.
Recently, Source-Free Domain Adaptation (SFDA) has drawn much attention, which tries to tackle the domain adaptation problem without using source data.
In this work, we propose a novel framework called SFDA-DE to address SFDA task via source Distribution Estimation.
arXiv Detail & Related papers (2022-04-24T12:22:19Z)
- UMAD: Universal Model Adaptation under Domain and Category Shift [138.12678159620248]
The Universal Model ADaptation (UMAD) framework handles both open-set and open-partial-set UDA scenarios without access to source data.
We develop an informative consistency score to help distinguish unknown samples from known samples.
Experiments on open-set and open-partial-set UDA scenarios demonstrate that UMAD exhibits comparable, if not superior, performance to state-of-the-art data-dependent methods.
arXiv Detail & Related papers (2021-12-16T01:22:59Z)
- A Review of Single-Source Deep Unsupervised Visual Domain Adaptation [81.07994783143533]
Large-scale labeled training datasets have enabled deep neural networks to excel across a wide range of benchmark vision tasks.
In many applications, it is prohibitively expensive and time-consuming to obtain large quantities of labeled data.
To cope with limited labeled training data, many have attempted to directly apply models trained on a large-scale labeled source domain to another sparsely labeled or unlabeled target domain.
arXiv Detail & Related papers (2020-09-01T00:06:50Z)
- Multi-source Domain Adaptation in the Deep Learning Era: A Systematic Survey [53.656086832255944]
Multi-source domain adaptation (MDA) is a powerful extension in which the labeled data may be collected from multiple sources.
MDA has attracted increasing attention in both academia and industry.
arXiv Detail & Related papers (2020-02-26T08:07:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.