Sparsely-Labeled Source Assisted Domain Adaptation
- URL: http://arxiv.org/abs/2005.04111v1
- Date: Fri, 8 May 2020 15:37:35 GMT
- Title: Sparsely-Labeled Source Assisted Domain Adaptation
- Authors: Wei Wang, Zhihui Wang, Yuankai Xiang, Jing Sun, Haojie Li, Fuming Sun, Zhengming Ding
- Abstract summary: This paper proposes a novel Sparsely-Labeled Source Assisted Domain Adaptation (SLSA-DA) algorithm.
Due to the label scarcity problem, projected clustering is conducted on both the source and target domains.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Domain Adaptation (DA) aims to generalize a classifier learned on the
source domain to the target domain. Existing DA methods usually assume that
abundant labels are available in the source domain. In practice, however, the
source domain often contains a large amount of unlabeled data and only a few
labeled samples, and how to transfer knowledge from such a sparsely-labeled
source domain to the target domain remains a challenge, which greatly limits
these methods' application in the wild. This paper proposes a novel
Sparsely-Labeled Source Assisted Domain Adaptation (SLSA-DA) algorithm to
address the challenge of limited labeled source samples. Specifically, to cope
with label scarcity, projected clustering is conducted on both the source and
target domains, so that the discriminative structure of the data can be
exploited. Label propagation is then adopted to progressively spread the labels
from the few labeled source samples to all of the unlabeled data, so that the
cluster labels are revealed correctly. Finally, we jointly align the marginal
and conditional distributions to mitigate the cross-domain mismatch, and
optimize these three procedures iteratively. Incorporating the three procedures
into a unified optimization framework is nontrivial, however, since some of the
variables to be optimized appear only implicitly in their formulations, so the
procedures cannot reinforce one another. Remarkably, we prove that the
projected clustering and the conditional distribution alignment can be
reformulated into equivalent expressions in which the implicit variables are
exposed in different optimization steps. The variables related to the three
procedures can therefore be optimized within a unified framework and mutually
reinforce one another, noticeably improving recognition performance.
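To make the three coupled procedures concrete, the following is a minimal sketch of how such a loop could be wired together. It is an illustration under assumptions, not the authors' implementation: the alignment step uses a standard JDA-style generalized eigenproblem for joint marginal and conditional MMD minimization, scikit-learn's LabelSpreading stands in for the paper's label-propagation step, and the projected-clustering step is approximated by propagating labels over the neighborhood graph of the projected features. The names slsa_da_sketch and mmd_matrices are hypothetical.

```python
# Hypothetical sketch of an SLSA-DA-style loop: projected representation,
# label propagation from a few source labels, and joint marginal/conditional
# MMD alignment, repeated so each step feeds the next. Not the paper's code.
import numpy as np
from scipy.linalg import eigh
from sklearn.semi_supervised import LabelSpreading

def mmd_matrices(labels_s, labels_t, n_classes):
    """JDA-style coefficient matrix M: marginal term M0 plus one
    conditional term Mc per class with (pseudo-)labels on both sides."""
    ns, nt = len(labels_s), len(labels_t)
    n = ns + nt
    e = np.concatenate([np.full(ns, 1.0 / ns), np.full(nt, -1.0 / nt)])[:, None]
    M = e @ e.T  # marginal MMD term M0
    for c in range(n_classes):
        s_idx = np.where(labels_s == c)[0]
        t_idx = ns + np.where(labels_t == c)[0]
        if len(s_idx) == 0 or len(t_idx) == 0:
            continue  # class missing on one side: skip its conditional term
        e = np.zeros((n, 1))
        e[s_idx], e[t_idx] = 1.0 / len(s_idx), -1.0 / len(t_idx)
        M += e @ e.T
    return M

def slsa_da_sketch(Xs, ys_partial, Xt, n_classes, dim=30, iters=5, lam=1.0):
    """Xs, Xt: (samples x features); ys_partial: ints with -1 marking the
    many unlabeled source samples. Returns target pseudo-labels."""
    X = np.vstack([Xs, Xt])
    ns, nt = len(Xs), len(Xt)
    n, d = X.shape
    H = np.eye(n) - np.full((n, n), 1.0 / n)  # centering matrix
    M = mmd_matrices(np.full(ns, -1), np.full(nt, -1), n_classes)  # M0 only
    pseudo = None
    for _ in range(iters):
        # Alignment: minimize tr(A^T X^T M X A) s.t. A^T X^T H X A = I,
        # i.e. keep the smallest generalized eigenvectors (JDA-style).
        A_mat = X.T @ M @ X + lam * np.eye(d)
        B_mat = X.T @ H @ X + 1e-6 * np.eye(d)  # small ridge keeps B definite
        _, vecs = eigh(A_mat, B_mat)
        Z = X @ vecs[:, :dim]  # shared low-dimensional subspace
        # Projected clustering + label propagation, merged here: spread the
        # few source labels through the kNN graph of the projected points so
        # the cluster structure of BOTH domains shapes the labels.
        y_seed = np.concatenate([ys_partial, np.full(nt, -1)])
        lp = LabelSpreading(kernel="knn", n_neighbors=7).fit(Z, y_seed)
        pseudo = lp.transduction_
        # Refresh the conditional-MMD terms with the propagated labels so the
        # next projection also aligns class-conditional distributions.
        M = mmd_matrices(pseudo[:ns], pseudo[ns:], n_classes)
    return pseudo[ns:]
```

Each pass refreshes the conditional-MMD terms with the newly propagated labels, which is the mutual reinforcement the abstract describes; the paper's actual contribution is reformulating the clustering and conditional-alignment terms so this coupling is solved in one unified framework rather than stitched together as above.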
Related papers
- Noisy Universal Domain Adaptation via Divergence Optimization for Visual Recognition (2023-04-20)
  A novel scenario named Noisy UniDA is proposed to transfer knowledge from a labeled source domain to an unlabeled target domain.
  A multi-head convolutional neural network framework is proposed to address all of the challenges faced in Noisy UniDA at once.
- Cross-Domain Label Propagation for Domain Adaptation with Discriminative Graph Self-Learning (2023-02-17)
  Domain adaptation manages to transfer the knowledge of well-labeled source data to unlabeled target data.
  We propose a novel domain adaptation method that infers target pseudo-labels through cross-domain label propagation.
- Cycle Label-Consistent Networks for Unsupervised Domain Adaptation (2022-05-27)
  Domain adaptation aims to leverage a labeled source domain to learn a classifier for an unlabeled target domain with a different distribution.
  We propose a simple yet efficient domain adaptation method, the Cycle Label-Consistent Network (CLCN), which exploits the cycle consistency of classification labels.
  We demonstrate the effectiveness of our approach on the MNIST-USPS-SVHN, Office-31, Office-Home and ImageCLEF-DA benchmarks.
- CA-UDA: Class-Aware Unsupervised Domain Adaptation with Optimal Assignment and Pseudo-Label Refinement (2022-05-26)
  Unsupervised domain adaptation (UDA) focuses on selecting good pseudo-labels as surrogates for the missing labels in the target data.
  Source-domain bias that deteriorates the pseudo-labels can still exist, since a network shared between the source and target domains is typically used for pseudo-label selection.
  We propose CA-UDA to improve the quality of the pseudo-labels and the UDA results with optimal assignment, a pseudo-label refinement strategy and class-aware domain alignment.
- Source-Free Domain Adaptation via Distribution Estimation (2022-04-24)
  Domain adaptation aims to transfer the knowledge learned from a labeled source domain to an unlabeled target domain whose data distribution differs.
  Recently, Source-Free Domain Adaptation (SFDA), which tackles the domain adaptation problem without using source data, has drawn much attention.
  In this work, we propose a novel framework called SFDA-DE that addresses the SFDA task via source Distribution Estimation.
- Instance Level Affinity-Based Transfer for Unsupervised Domain Adaptation (2021-04-03)
  We propose an instance-affinity-based criterion for source-to-target transfer during adaptation, called ILA-DA.
  We first propose a reliable and efficient method to extract similar and dissimilar samples across the source and target, and utilize a multi-sample contrastive loss to drive the domain-alignment process.
  We verify the effectiveness of ILA-DA by observing consistent accuracy improvements over popular domain adaptation approaches on a variety of benchmark datasets.
- Divergence Optimization for Noisy Universal Domain Adaptation (2021-04-01)
  Universal domain adaptation (UniDA) has been proposed to transfer knowledge learned from a label-rich source domain to a label-scarce target domain.
  This paper introduces a two-head convolutional neural network framework to solve all of these problems simultaneously.
- Learning Target Domain Specific Classifier for Partial Domain Adaptation (2020-08-25)
  Unsupervised domain adaptation (UDA) aims at reducing the distribution discrepancy when transferring knowledge from a labeled source domain to an unlabeled target domain.
  This paper focuses on a more realistic UDA scenario in which the target label space is subsumed by the source label space.
- Domain Adaptation with Conditional Distribution Matching and Generalized Label Shift (2020-03-10)
  Adversarial learning has demonstrated good performance in the unsupervised domain adaptation setting.
  We propose a new assumption, generalized label shift (GLS), to improve robustness against mismatched label distributions.
  Our algorithms outperform the base versions, with vast improvements for large label distribution mismatches.