Associative Partial Domain Adaptation
- URL: http://arxiv.org/abs/2008.03111v1
- Date: Fri, 7 Aug 2020 12:15:38 GMT
- Title: Associative Partial Domain Adaptation
- Authors: Youngeun Kim, Sungeun Hong, Seunghan Yang, Sungil Kang, Yunho Jeon,
Jiwon Kim
- Abstract summary: Partial Domain Adaptation (PDA) addresses a practical scenario in which the target domain contains only a subset of the classes in the source domain.
We propose a novel approach to fully exploit multi-level associations that can arise in PDA.
- Score: 13.383299097180362
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Partial Domain Adaptation (PDA) addresses a practical scenario in which the target
domain contains only a subset of the classes in the source domain. While PDA should
take both class-level and sample-level information into account to mitigate negative
transfer, current approaches mostly rely on only one of them. In this paper, we
propose a novel approach to fully exploit multi-level associations that can
arise in PDA. Our Associative Partial Domain Adaptation (APDA) utilizes
intra-domain association to actively select out non-trivial anomaly samples in
each source-private class that sample-level weighting cannot handle.
Additionally, our method considers inter-domain association to encourage
positive transfer by mapping between nearby target samples and source samples
with high label-commonness. For this, we exploit feature propagation in a
proposed label space consisting of source ground-truth labels and target
probabilistic labels. We further propose a geometric guidance loss based on the
label commonness of each source class to encourage positive transfer. Our APDA
consistently achieves state-of-the-art performance across public datasets.
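The abstract mentions two mechanisms without giving formulas: per-source-class "label commonness" and feature propagation over a label space that stacks source ground-truth labels with target probabilistic labels. The sketch below illustrates these two ideas under common PDA conventions (commonness estimated by averaging target predictions; one step of affinity-weighted label smoothing). It is a minimal illustration, not the paper's exact formulation; all function names and the `alpha` parameter are illustrative.

```python
import numpy as np

def class_commonness(target_probs):
    """Estimate per-source-class label commonness by averaging the
    target samples' probabilistic labels (a common PDA heuristic;
    APDA's exact weighting may differ). Returns weights in [0, 1]."""
    w = target_probs.mean(axis=0)        # (C,) average predicted mass per class
    return w / w.max()                   # normalize so the most common class is 1

def propagate_labels(feats_s, feats_t, labels_s, target_probs, alpha=0.9):
    """One step of propagation over a joint label space: source rows
    carry one-hot ground truth, target rows carry probabilistic labels;
    an affinity-weighted average refines the target labels so that
    nearby samples receive similar label distributions."""
    C = target_probs.shape[1]
    Y = np.vstack([np.eye(C)[labels_s], target_probs])  # joint label space
    F = np.vstack([feats_s, feats_t])
    F = F / np.linalg.norm(F, axis=1, keepdims=True)    # unit-normalize features
    A = np.exp(F @ F.T)                                 # pairwise affinities
    A = A / A.sum(axis=1, keepdims=True)                # row-stochastic transition
    Y_new = alpha * (A @ Y) + (1 - alpha) * Y           # smoothed labels
    return Y_new[len(feats_s):]                         # refined target labels
```

Because `A` is row-stochastic and every row of `Y` sums to one, the refined target labels remain valid probability distributions, which is what lets the propagated output feed back into the commonness estimate.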
Related papers
- Adaptive Betweenness Clustering for Semi-Supervised Domain Adaptation [108.40945109477886]
We propose a novel SSDA approach named Graph-based Adaptive Betweenness Clustering (G-ABC) for achieving categorical domain alignment.
Our method outperforms previous state-of-the-art SSDA approaches, demonstrating the superiority of the proposed G-ABC algorithm.
arXiv Detail & Related papers (2024-01-21T09:57:56Z)
- Semi-supervised Domain Adaptation via Prototype-based Multi-level Learning [4.232614032390374]
In semi-supervised domain adaptation (SSDA), a few labeled target samples of each class help the model to transfer knowledge representation from the fully labeled source domain to the target domain.
We propose a Prototype-based Multi-level Learning (ProML) framework to better tap the potential of labeled target samples.
arXiv Detail & Related papers (2023-05-04T10:09:30Z)
- CLDA: Contrastive Learning for Semi-Supervised Domain Adaptation [1.2691047660244335]
Unsupervised Domain Adaptation (UDA) aims to align the labeled source distribution with the unlabeled target distribution to obtain domain invariant predictive models.
We propose Contrastive Learning framework for semi-supervised Domain Adaptation (CLDA) that attempts to bridge the intra-domain gap.
CLDA achieves state-of-the-art results on all the above datasets.
arXiv Detail & Related papers (2021-06-30T20:23:19Z)
- Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation [85.6961770631173]
In semi-supervised domain adaptation, a few labeled samples per class in the target domain guide features of the remaining target samples to aggregate around them.
We propose a novel approach called Cross-domain Adaptive Clustering to address this problem.
arXiv Detail & Related papers (2021-04-19T16:07:32Z)
- Instance Level Affinity-Based Transfer for Unsupervised Domain Adaptation [74.71931918541748]
We propose an instance affinity based criterion for source to target transfer during adaptation, called ILA-DA.
We first propose a reliable and efficient method to extract similar and dissimilar samples across source and target, and utilize a multi-sample contrastive loss to drive the domain alignment process.
We verify the effectiveness of ILA-DA by observing consistent improvements in accuracy over popular domain adaptation approaches on a variety of benchmark datasets.
arXiv Detail & Related papers (2021-04-03T01:33:14Z)
- Your Classifier can Secretly Suffice Multi-Source Domain Adaptation [72.47706604261992]
Multi-Source Domain Adaptation (MSDA) deals with the transfer of task knowledge from multiple labeled source domains to an unlabeled target domain.
We present a different perspective to MSDA wherein deep models are observed to implicitly align the domains under label supervision.
arXiv Detail & Related papers (2021-03-20T12:44:13Z)
- Adaptively-Accumulated Knowledge Transfer for Partial Domain Adaptation [66.74638960925854]
Partial domain adaptation (PDA) deals with a realistic and challenging problem in which the source domain label space subsumes the target domain label space.
We propose an Adaptively-Accumulated Knowledge Transfer framework (A$2$KT) to align the relevant categories across two domains.
arXiv Detail & Related papers (2020-08-27T00:53:43Z)
- Class Conditional Alignment for Partial Domain Adaptation [10.506584969668792]
Adversarial adaptation models have demonstrated significant progress towards transferring knowledge from a labeled source dataset to an unlabeled target dataset.
PDA investigates the scenarios in which the source domain is large and diverse, and the target label space is a subset of the source label space.
We propose a multi-class adversarial architecture for PDA.
arXiv Detail & Related papers (2020-03-14T23:51:57Z)
- Towards Fair Cross-Domain Adaptation via Generative Learning [50.76694500782927]
Domain Adaptation (DA) targets at adapting a model trained over the well-labeled source domain to the unlabeled target domain lying in different distributions.
We develop a novel Generative Few-shot Cross-domain Adaptation (GFCA) algorithm for fair cross-domain classification.
arXiv Detail & Related papers (2020-03-04T23:25:09Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.