Inter-Domain Mixup for Semi-Supervised Domain Adaptation
- URL: http://arxiv.org/abs/2401.11453v1
- Date: Sun, 21 Jan 2024 10:20:46 GMT
- Title: Inter-Domain Mixup for Semi-Supervised Domain Adaptation
- Authors: Jichang Li, Guanbin Li, Yizhou Yu
- Abstract summary: Semi-supervised domain adaptation (SSDA) aims to bridge source and target domain distributions, with a small number of target labels available.
Existing SSDA work fails to make full use of label information from both source and target domains for feature alignment across domains.
This paper presents a novel SSDA approach, Inter-domain Mixup with Neighborhood Expansion (IDMNE), to tackle this issue.
- Score: 108.40945109477886
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Semi-supervised domain adaptation (SSDA) aims to bridge source and target
domain distributions, with a small number of target labels available, achieving
better classification performance than unsupervised domain adaptation (UDA).
However, existing SSDA work fails to make full use of label information from
both source and target domains for cross-domain feature alignment, resulting
in label mismatch in the label space at test time. This paper presents
a novel SSDA approach, Inter-domain Mixup with Neighborhood Expansion (IDMNE),
to tackle this issue. Firstly, we introduce a cross-domain feature alignment
strategy, Inter-domain Mixup, that incorporates label information into model
adaptation. Specifically, we employ sample-level and manifold-level data mixing
to generate compatible training samples. These newly generated samples carry
reliable, actual label information and exhibit both diversity and cross-domain
compatibility; this extra supervision facilitates cross-domain feature
alignment and mitigates label mismatch. Additionally, we
utilize Neighborhood Expansion to leverage high-confidence pseudo-labeled
samples in the target domain, diversifying the label information of the target
domain and thereby further increasing the performance of the adaptation model.
Accordingly, the proposed approach outperforms existing state-of-the-art
methods, achieving significant accuracy improvements on popular SSDA
benchmarks, including DomainNet, Office-Home, and Office-31.
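The sample-level half of Inter-domain Mixup is, as described above, mixup applied across domains: a labeled source sample and a labeled target sample are convexly combined, and the model is supervised with the correspondingly interpolated label; the manifold-level variant applies the same interpolation to intermediate features rather than raw inputs. Below is a minimal PyTorch sketch of the sample-level case; the function and variable names (mix_source_target, soft_cross_entropy, alpha) are illustrative assumptions, not taken from the paper's code.

```python
import torch
import torch.nn.functional as F

def soft_cross_entropy(logits, soft_targets):
    # Cross-entropy against soft (interpolated) label distributions.
    return -(soft_targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()

def mix_source_target(x_src, y_src, x_tgt, y_tgt, num_classes, alpha=1.0):
    """Sample-level inter-domain mixup (a sketch, assuming one shared
    mixing ratio per batch drawn from Beta(alpha, alpha)).

    x_src, x_tgt: input batches of identical shape (B, ...).
    y_src, y_tgt: integer class labels of shape (B,).
    """
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    x_mix = lam * x_src + (1.0 - lam) * x_tgt
    y_mix = lam * F.one_hot(y_src, num_classes).float() \
          + (1.0 - lam) * F.one_hot(y_tgt, num_classes).float()
    return x_mix, y_mix

# Usage inside a training step (model maps inputs to class logits):
#   x_mix, y_mix = mix_source_target(x_src, y_src, x_tgt, y_tgt, num_classes)
#   loss = soft_cross_entropy(model(x_mix), y_mix)
```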
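Neighborhood Expansion, as the abstract describes it, enlarges the pool of labeled target data by admitting high-confidence pseudo-labeled samples. A hedged sketch of one common selection rule, confidence thresholding, follows; the 0.95 cutoff and the helper name are assumptions, since the abstract does not specify the exact criterion.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def select_pseudo_labeled(model, x_unlabeled, threshold=0.95):
    """Keep unlabeled target samples whose top predicted class probability
    exceeds `threshold`, and return them with their pseudo-labels.
    The threshold value is an assumed hyperparameter, not from the paper.
    """
    probs = F.softmax(model(x_unlabeled), dim=1)
    confidence, pseudo_labels = probs.max(dim=1)
    keep = confidence >= threshold
    return x_unlabeled[keep], pseudo_labels[keep]
```

The selected samples and their pseudo-labels can then be folded back into the labeled target pool, for example as the y_tgt argument of the mixup sketch above.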
Related papers
- Evidential Graph Contrastive Alignment for Source-Free Blending-Target Domain Adaptation [3.0134158269410207]
We propose a new method called Evidential Contrastive Alignment (ECA) to decouple the blending target domain and alleviate the effect of noisy target pseudo-labels.
ECA outperforms other methods with considerable gains and achieves results comparable to methods that have access to domain labels or source data in advance.
arXiv Detail & Related papers (2024-08-14T13:02:20Z)
- Adaptive Betweenness Clustering for Semi-Supervised Domain Adaptation [108.40945109477886]
We propose a novel SSDA approach named Graph-based Adaptive Betweenness Clustering (G-ABC) for achieving categorical domain alignment.
Our method outperforms previous state-of-the-art SSDA approaches, demonstrating the superiority of the proposed G-ABC algorithm.
arXiv Detail & Related papers (2024-01-21T09:57:56Z)
- CLDA: Contrastive Learning for Semi-Supervised Domain Adaptation [1.2691047660244335]
Unsupervised Domain Adaptation (UDA) aims to align the labeled source distribution with the unlabeled target distribution to obtain domain-invariant predictive models.
We propose a Contrastive Learning framework for semi-supervised Domain Adaptation (CLDA) that attempts to bridge the intra-domain gap.
CLDA achieves state-of-the-art results on all the above datasets.
arXiv Detail & Related papers (2021-06-30T20:23:19Z)
- Cross-domain Contrastive Learning for Unsupervised Domain Adaptation [108.63914324182984]
Unsupervised domain adaptation (UDA) aims to transfer knowledge learned from a fully-labeled source domain to a different unlabeled target domain.
We build upon contrastive self-supervised learning to align features so as to reduce the domain discrepancy between training and testing sets.
arXiv Detail & Related papers (2021-06-10T06:32:30Z)
- Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation [85.6961770631173]
In semi-supervised domain adaptation, a few labeled samples per class in the target domain guide features of the remaining target samples to aggregate around them.
We propose a novel approach called Cross-domain Adaptive Clustering to address this problem.
arXiv Detail & Related papers (2021-04-19T16:07:32Z)
- Instance Level Affinity-Based Transfer for Unsupervised Domain Adaptation [74.71931918541748]
We propose an instance affinity based criterion for source to target transfer during adaptation, called ILA-DA.
We first propose a reliable and efficient method to extract similar and dissimilar samples across source and target, and utilize a multi-sample contrastive loss to drive the domain alignment process.
We verify the effectiveness of ILA-DA by observing consistent improvements in accuracy over popular domain adaptation approaches on a variety of benchmark datasets.
arXiv Detail & Related papers (2021-04-03T01:33:14Z)
- Your Classifier can Secretly Suffice Multi-Source Domain Adaptation [72.47706604261992]
Multi-Source Domain Adaptation (MSDA) deals with the transfer of task knowledge from multiple labeled source domains to an unlabeled target domain.
We present a different perspective on MSDA, wherein deep models are observed to implicitly align the domains under label supervision.
arXiv Detail & Related papers (2021-03-20T12:44:13Z)