Strong-Weak Integrated Semi-supervision for Unsupervised Single and
Multi Target Domain Adaptation
- URL: http://arxiv.org/abs/2309.06528v1
- Date: Tue, 12 Sep 2023 19:08:54 GMT
- Title: Strong-Weak Integrated Semi-supervision for Unsupervised Single and
Multi Target Domain Adaptation
- Authors: Xiaohu Lu and Hayder Radha
- Abstract summary: Unsupervised domain adaptation (UDA) focuses on transferring knowledge learned in the labeled source domain to the unlabeled target domain.
In this paper, we propose a novel strong-weak integrated semi-supervision (SWISS) learning strategy for image classification.
- Score: 6.472434306724611
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Unsupervised domain adaptation (UDA) focuses on transferring knowledge
learned in the labeled source domain to the unlabeled target domain. Despite
significant progress that has been achieved in single-target domain adaptation
for image classification in recent years, the extension from single-target to
multi-target domain adaptation is still a largely unexplored problem area. In
general, unsupervised domain adaptation faces a major challenge when attempting
to learn reliable information from a single unlabeled target domain. Increasing
the number of unlabeled target domains exacerbates the problem significantly. In
this paper, we propose a novel strong-weak integrated
semi-supervision (SWISS) learning strategy for image classification using
unsupervised domain adaptation that works well for both single-target and
multi-target scenarios. Under the proposed SWISS-UDA framework, a strong
representative set with high confidence but low diversity target domain samples
and a weak representative set with low confidence but high diversity target
domain samples are updated constantly during the training process. Both sets
are fused to generate an augmented strong-weak training batch with
pseudo-labels to train the network during every iteration. The extension from
single-target to multi-target domain adaptation is accomplished by exploring
the class-wise distance relationship between domains and replacing the strong
representative set with much stronger samples from peer domains via peer
scaffolding. Moreover, a novel adversarial logit loss is proposed to reduce the
intra-class divergence between source and target domains, which is
back-propagated adversarially with a gradient reverse layer between the
classifier and the rest of the network. Experimental results based on three
benchmarks, Office-31, Office-Home, and DomainNet, show the effectiveness of
the proposed SWISS framework.
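The strong-weak batch construction described in the abstract can be sketched as follows. This is a minimal illustration based only on the abstract, not the authors' released code: the concrete selection rules (top-confidence samples for the strong set, class-stratified sampling from the low-confidence half for the weak set) and the helper name `build_swiss_batch` are assumptions.

```python
import numpy as np

def build_swiss_batch(features, probs, n_strong, n_weak, rng=None):
    """Fuse a strong and a weak representative set into one
    pseudo-labeled training batch (hypothetical sketch of the SWISS
    idea; the paper's exact selection rules may differ).

    features: (N, D) target-domain features
    probs:    (N, C) current model's softmax outputs on those samples
    """
    rng = np.random.default_rng() if rng is None else rng
    pseudo = probs.argmax(axis=1)   # pseudo-labels from model predictions
    conf = probs.max(axis=1)        # prediction confidence per sample
    order = np.argsort(-conf)       # most confident first

    # Strong set: high confidence but typically low diversity.
    strong_idx = order[:n_strong]

    # Weak set: from the low-confidence half, sample across predicted
    # classes to keep diversity high despite noisier pseudo-labels.
    low = order[len(order) // 2:]
    per_class = max(1, n_weak // probs.shape[1])
    weak_idx = []
    for c in np.unique(pseudo[low]):
        cand = low[pseudo[low] == c]
        take = min(len(cand), per_class)
        weak_idx.extend(rng.choice(cand, size=take, replace=False))
    weak_idx = np.asarray(weak_idx[:n_weak], dtype=int)

    # Fused strong-weak batch with its pseudo-labels.
    idx = np.concatenate([strong_idx, weak_idx])
    return features[idx], pseudo[idx]
```

Per the abstract, both sets are updated constantly during training; a single call here produces one fused batch for one iteration.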
Related papers
- Contrastive Adversarial Training for Unsupervised Domain Adaptation [2.432037584128226]
Domain adversarial training has been successfully adopted for various domain adaptation tasks.
With large models, adversarial training is easily biased towards the source domain and struggles to adapt to the target domain.
We propose a contrastive adversarial training (CAT) approach that leverages the labeled source-domain samples to reinforce and regulate feature generation for the target domain.
arXiv Detail & Related papers (2024-07-17T17:59:21Z)
- Revisiting the Domain Shift and Sample Uncertainty in Multi-source Active Domain Transfer [69.82229895838577]
Active Domain Adaptation (ADA) aims to maximally boost model adaptation in a new target domain by actively selecting a limited number of target samples to annotate.
This setting neglects the more practical scenario where training data are collected from multiple sources.
This motivates us to target a new and challenging setting of knowledge transfer that extends ADA from a single source domain to multiple source domains.
arXiv Detail & Related papers (2023-11-21T13:12:21Z)
- Domain Adaptive Few-Shot Open-Set Learning [36.39622440120531]
We propose Domain Adaptive Few-Shot Open Set Recognition (DA-FSOS) and introduce a meta-learning-based architecture named DAFOS-NET.
Our training approach ensures that DAFOS-NET can generalize well to new scenarios in the target domain.
We present three benchmarks for DA-FSOS based on the Office-Home, mini-ImageNet/CUB, and DomainNet datasets.
arXiv Detail & Related papers (2023-09-22T12:04:47Z)
- Adversarial Bi-Regressor Network for Domain Adaptive Regression [52.5168835502987]
It is essential to learn a cross-domain regressor to mitigate the domain shift.
This paper proposes a novel method, Adversarial Bi-Regressor Network (ABRNet), to seek a more effective cross-domain regression model.
arXiv Detail & Related papers (2022-09-20T18:38:28Z)
- From Big to Small: Adaptive Learning to Partial-Set Domains [94.92635970450578]
Domain adaptation targets knowledge acquisition and dissemination from a labeled source domain to an unlabeled target domain under distribution shift.
Recent advances show that large-scale deep pre-trained models carry rich knowledge for tackling diverse downstream tasks of small scale.
This paper introduces Partial Domain Adaptation (PDA), a learning paradigm that relaxes the identical-class-space assumption so that the source class space subsumes the target class space.
arXiv Detail & Related papers (2022-03-14T07:02:45Z)
- Reiterative Domain Aware Multi-target Adaptation [14.352214079374463]
We propose Reiterative D-CGCT (RD-CGCT) that obtains better adaptation performance by reiterating multiple times over each target domain.
RD-CGCT significantly improves the performance over D-CGCT on the Office-Home and Office-31 datasets.
arXiv Detail & Related papers (2021-08-26T17:12:25Z)
- Multi-Target Adversarial Frameworks for Domain Adaptation in Semantic Segmentation [32.39557675340562]
We address the task of unsupervised domain adaptation (UDA) for semantic segmentation in the presence of multiple target domains.
We introduce two adversarial frameworks: (i) multi-discriminator, which explicitly aligns each target domain to its counterparts, and (ii) multi-target knowledge transfer, which learns a target-agnostic model.
In all tested scenarios, our approaches consistently outperform baselines, setting competitive standards for the novel task.
arXiv Detail & Related papers (2021-08-16T08:36:10Z)
- CLDA: Contrastive Learning for Semi-Supervised Domain Adaptation [1.2691047660244335]
Unsupervised Domain Adaptation (UDA) aims to align the labeled source distribution with the unlabeled target distribution to obtain domain invariant predictive models.
We propose Contrastive Learning framework for semi-supervised Domain Adaptation (CLDA) that attempts to bridge the intra-domain gap.
CLDA achieves state-of-the-art results on all the above datasets.
arXiv Detail & Related papers (2021-06-30T20:23:19Z)
- AFAN: Augmented Feature Alignment Network for Cross-Domain Object Detection [90.18752912204778]
Unsupervised domain adaptation for object detection is a challenging problem with many real-world applications.
We propose a novel augmented feature alignment network (AFAN) which integrates intermediate domain image generation and domain-adversarial training.
Our approach significantly outperforms the state-of-the-art methods on standard benchmarks for both similar and dissimilar domain adaptations.
arXiv Detail & Related papers (2021-06-10T05:01:20Z)
- Discriminative Cross-Domain Feature Learning for Partial Domain Adaptation [70.45936509510528]
Partial domain adaptation aims to adapt knowledge from a larger and more diverse source domain to a smaller target domain with fewer classes.
Recent practice on domain adaptation extracts effective features by incorporating pseudo-labels for the target domain.
It is essential to align target data with only a small set of source data.
arXiv Detail & Related papers (2020-08-26T03:18:53Z)
- Alleviating Semantic-level Shift: A Semi-supervised Domain Adaptation Method for Semantic Segmentation [97.8552697905657]
A key challenge of this task is how to alleviate the data distribution discrepancy between the source and target domains.
We propose Alleviating Semantic-level Shift (ASS), which can successfully promote the distribution consistency from both global and local views.
We apply our ASS to two domain adaptation tasks, from GTA5 to Cityscapes and from Synthia to Cityscapes.
arXiv Detail & Related papers (2020-04-02T03:25:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.