Teacher-Student Competition for Unsupervised Domain Adaptation
- URL: http://arxiv.org/abs/2010.09572v2
- Date: Tue, 20 Oct 2020 03:37:22 GMT
- Title: Teacher-Student Competition for Unsupervised Domain Adaptation
- Authors: Ruixin Xiao, Zhilei Liu, Baoyuan Wu
- Abstract summary: This paper proposes an unsupervised domain adaptation approach with Teacher-Student Competition (TSC)
In particular, a student network is introduced to learn the target-specific feature space, and we design a novel competition mechanism to select more credible pseudo-labels for the training of student network.
Our proposed TSC framework significantly outperforms the state-of-the-art domain adaptation methods on Office-31 and ImageCLEF-DA benchmarks.
- Score: 28.734814582911845
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With only class-level supervision from the source domain, existing
unsupervised domain adaptation (UDA) methods mainly learn domain-invariant
representations from a shared feature extractor, which causes the source-bias
problem. This paper proposes an unsupervised domain adaptation approach with
Teacher-Student Competition (TSC). In particular, a student network is
introduced to learn the target-specific feature space, and we design a novel
competition mechanism to select more credible pseudo-labels for training the
student network. We introduce a teacher network with the structure of an
existing conventional UDA method, and the teacher and student networks compete
to provide the pseudo-label that constrains the training of each target sample
in the student network. Extensive experiments demonstrate that our proposed TSC
framework significantly outperforms state-of-the-art domain adaptation methods
on the Office-31 and ImageCLEF-DA benchmarks.
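The abstract does not give implementation details, but the competition mechanism can be read as follows: for each unlabeled target sample, the teacher and the student each propose a pseudo-label, and the more credible prediction is the one used to supervise the student. Below is a minimal sketch of that reading; the softmax-confidence criterion, the threshold, and all names are assumptions for illustration, not details from the paper.

```python
import torch
import torch.nn.functional as F

def competition_pseudo_labels(teacher_logits, student_logits, conf_threshold=0.9):
    """Hypothetical competition step: the more confident network wins.

    All details here (softmax confidence, the 0.9 threshold) are assumptions
    for illustration, not taken from the TSC paper.
    """
    t_prob = F.softmax(teacher_logits, dim=1)
    s_prob = F.softmax(student_logits, dim=1)
    t_conf, t_label = t_prob.max(dim=1)
    s_conf, s_label = s_prob.max(dim=1)

    teacher_wins = t_conf >= s_conf                      # per-sample competition
    pseudo_label = torch.where(teacher_wins, t_label, s_label)
    confidence = torch.where(teacher_wins, t_conf, s_conf)
    keep = confidence >= conf_threshold                  # mask out low-credibility samples
    return pseudo_label, keep

def student_target_loss(student_logits, pseudo_label, keep):
    """Cross-entropy on the winning pseudo-labels, masked by credibility."""
    loss = F.cross_entropy(student_logits, pseudo_label, reduction="none")
    return (loss * keep.float()).sum() / keep.float().sum().clamp(min=1.0)

# Usage with fake logits for an 8-sample, 31-class batch (e.g. Office-31):
t_logits, s_logits = torch.randn(8, 31), torch.randn(8, 31)
labels, keep = competition_pseudo_labels(t_logits, s_logits)
loss = student_target_loss(s_logits, labels, keep)
```

In this sketch the student is only trained on target samples where the winning prediction clears the confidence threshold, which matches the paper's stated goal of selecting more credible pseudo-labels.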
Related papers
- Direct Distillation between Different Domains [97.39470334253163]
We propose a new one-stage method dubbed "Direct Distillation between Different Domains" (4Ds)
We first design a learnable adapter based on the Fourier transform to separate the domain-invariant knowledge from the domain-specific knowledge.
We then build a fusion-activation mechanism to transfer the valuable domain-invariant knowledge to the student network.
arXiv Detail & Related papers (2024-01-12T02:48:51Z) - Self-training through Classifier Disagreement for Cross-Domain Opinion
Target Extraction [62.41511766918932]
Opinion target extraction (OTE) or aspect extraction (AE) is a fundamental task in opinion mining.
Recent work focuses on cross-domain OTE, which is typically encountered in real-world scenarios.
We propose a new SSL approach that selects unlabelled target samples on which the outputs of a domain-specific teacher network and a student network disagree.
arXiv Detail & Related papers (2023-02-28T16:31:17Z) - Multi-level Consistency Learning for Semi-supervised Domain Adaptation [85.90600060675632]
Semi-supervised domain adaptation (SSDA) aims to apply knowledge learned from a fully labeled source domain to a scarcely labeled target domain.
We propose a Multi-level Consistency Learning framework for SSDA.
arXiv Detail & Related papers (2022-05-09T06:41:18Z) - One-Class Knowledge Distillation for Face Presentation Attack Detection [53.30584138746973]
This paper introduces a teacher-student framework to improve the cross-domain performance of face PAD with one-class domain adaptation.
Student networks are trained to mimic the teacher network and learn similar representations for genuine face samples of the target domain.
In the test phase, the similarity score between the representations of the teacher and student networks is used to distinguish attacks from genuine ones.
arXiv Detail & Related papers (2022-05-08T06:20:59Z) - UDA-COPE: Unsupervised Domain Adaptation for Category-level Object Pose
Estimation [84.16372642822495]
We propose an unsupervised domain adaptation (UDA) method for category-level object pose estimation, called UDA-COPE.
Inspired by recent multi-modal UDA techniques, the proposed method exploits a teacher-student self-supervised learning scheme to train a pose estimation network without using target domain labels.
arXiv Detail & Related papers (2021-11-24T16:00:48Z) - Robust Ensembling Network for Unsupervised Domain Adaptation [20.152004296679138]
We propose a Robust Ensembling Network (REN) for unsupervised domain adaptation (UDA)
REN mainly includes a teacher network and a student network; the student performs standard domain adaptation training and updates the weights of the teacher network.
To improve the basic ability of the student network, a consistency constraint is used to balance the error between the student and teacher networks (a generic sketch of such a constraint appears after this list).
arXiv Detail & Related papers (2021-08-21T09:19:13Z) - Unsupervised Domain Adaptation for Image Classification via
Structure-Conditioned Adversarial Learning [70.79486026698419]
Unsupervised domain adaptation (UDA) typically carries out knowledge transfer from a label-rich source domain to an unlabeled target domain by adversarial learning.
We propose an end-to-end structure-conditioned adversarial learning scheme (SCAL) that is able to preserve the intra-class compactness during domain distribution alignment.
arXiv Detail & Related papers (2021-03-04T03:12:54Z)
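The consistency constraint mentioned in the Robust Ensembling Network (REN) entry above follows a common teacher-student pattern: penalize disagreement between the two networks' predictions on the same target image, with the teacher typically updated as an exponential moving average (EMA) of the student. The sketch below illustrates only that generic pattern; the backbone, the EMA decay, and the loss form are assumptions and not REN's exact formulation.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

@torch.no_grad()
def ema_update(teacher, student, decay=0.999):
    """Move the teacher's weights toward the student's (mean-teacher style)."""
    for t_param, s_param in zip(teacher.parameters(), student.parameters()):
        t_param.mul_(decay).add_(s_param, alpha=1.0 - decay)

def consistency_loss(student_logits, teacher_logits):
    """Penalize disagreement between student and teacher class predictions."""
    return F.mse_loss(F.softmax(student_logits, dim=1),
                      F.softmax(teacher_logits, dim=1))

# Minimal usage: the teacher starts as a frozen copy of the student.
student = nn.Linear(16, 4)               # stand-in for a real backbone
teacher = copy.deepcopy(student)
x_t = torch.randn(8, 16)                 # a batch of target-domain features
loss = consistency_loss(student(x_t), teacher(x_t).detach())
loss.backward()
ema_update(teacher, student)
```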
This list is automatically generated from the titles and abstracts of the papers on this site.