Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation
- URL: http://arxiv.org/abs/2104.09415v1
- Date: Mon, 19 Apr 2021 16:07:32 GMT
- Title: Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation
- Authors: Jichang Li, Guanbin Li, Yemin Shi, Yizhou Yu
- Abstract summary: In semi-supervised domain adaptation, a few labeled samples per class in the target domain guide features of the remaining target samples to aggregate around them.
We propose a novel approach called Cross-domain Adaptive Clustering to address this problem.
- Score: 85.6961770631173
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In semi-supervised domain adaptation, a few labeled samples per class in the
target domain guide features of the remaining target samples to aggregate
around them. However, the trained model cannot produce a highly discriminative
feature representation for the target domain because the training data is
dominated by labeled samples from the source domain. This could lead to
disconnection between the labeled and unlabeled target samples as well as
misalignment between unlabeled target samples and the source domain. In this
paper, we propose a novel approach called Cross-domain Adaptive Clustering to
address this problem. To achieve both inter-domain and intra-domain adaptation,
we first introduce an adversarial adaptive clustering loss to group features of
unlabeled target data into clusters and perform cluster-wise feature alignment
across the source and target domains. We further apply pseudo labeling to
unlabeled samples in the target domain and retain pseudo-labels with high
confidence. Pseudo labeling expands the number of "labeled" samples in each
class in the target domain, and thus produces a more robust and powerful
cluster core for each class to facilitate adversarial learning. Extensive
experiments on benchmark datasets, including DomainNet, Office-Home and Office,
demonstrate that our proposed approach achieves state-of-the-art
performance in semi-supervised domain adaptation.
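The confidence-based pseudo-labeling step described in the abstract — keeping only unlabeled target samples whose predicted class probability exceeds a threshold — can be sketched as follows. This is a minimal illustration under assumed details, not the authors' implementation; the function name `select_pseudo_labels` and the threshold value are hypothetical.

```python
import numpy as np

def select_pseudo_labels(logits, threshold=0.95):
    """Confidence-thresholded pseudo-labeling (illustrative sketch).

    logits: (N, C) array of classifier outputs on unlabeled target samples.
    Returns (indices, labels) for samples whose top softmax probability
    meets the threshold; only these join the "labeled" target set.
    """
    # Numerically stable softmax over the class dimension.
    z = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    conf = probs.max(axis=1)      # confidence of the top prediction
    labels = probs.argmax(axis=1) # predicted class = pseudo-label
    keep = conf >= threshold      # retain only high-confidence samples
    return np.flatnonzero(keep), labels[keep]

# Example: 3 samples, 2 classes; the ambiguous sample is dropped.
logits = np.array([[5.0, 0.0],   # confident -> class 0
                   [0.2, 0.1],   # ambiguous -> discarded
                   [0.0, 6.0]])  # confident -> class 1
idx, pl = select_pseudo_labels(logits, threshold=0.9)
# idx -> [0, 2], pl -> [0, 1]
```

In the paper's setting, the retained pseudo-labeled samples enlarge each target-class cluster, which in turn strengthens the cluster cores used by the adversarial adaptive clustering loss.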
Related papers
- Inter-Domain Mixup for Semi-Supervised Domain Adaptation [108.40945109477886]
Semi-supervised domain adaptation (SSDA) aims to bridge source and target domain distributions, with a small number of target labels available.
Existing SSDA work fails to make full use of label information from both source and target domains for feature alignment across domains.
This paper presents a novel SSDA approach, Inter-domain Mixup with Neighborhood Expansion (IDMNE), to tackle this issue.
arXiv Detail & Related papers (2024-01-21T10:20:46Z)
- Adaptive Betweenness Clustering for Semi-Supervised Domain Adaptation [108.40945109477886]
We propose a novel SSDA approach named Graph-based Adaptive Betweenness Clustering (G-ABC) for achieving categorical domain alignment.
Our method outperforms previous state-of-the-art SSDA approaches, demonstrating the superiority of the proposed G-ABC algorithm.
arXiv Detail & Related papers (2024-01-21T09:57:56Z)
- Semi-supervised Domain Adaptation via Prototype-based Multi-level Learning [4.232614032390374]
In semi-supervised domain adaptation (SSDA), a few labeled target samples of each class help the model to transfer knowledge representation from the fully labeled source domain to the target domain.
We propose a Prototype-based Multi-level Learning (ProML) framework to better tap the potential of labeled target samples.
arXiv Detail & Related papers (2023-05-04T10:09:30Z)
- Domain-Generalizable Multiple-Domain Clustering [55.295300263404265]
This work generalizes the problem of unsupervised domain generalization to the case in which no labeled samples are available (completely unsupervised).
We are given unlabeled samples from multiple source domains, and we aim to learn a shared predictor that assigns examples to semantically related clusters.
Evaluation is done by predicting cluster assignments in previously unseen domains.
arXiv Detail & Related papers (2023-01-31T10:24:50Z)
- Polycentric Clustering and Structural Regularization for Source-free Unsupervised Domain Adaptation [20.952542421577487]
Source-Free Domain Adaptation (SFDA) aims to solve the domain adaptation problem by transferring the knowledge learned from a pre-trained source model to an unseen target domain.
Most existing methods assign pseudo-labels to the target data by generating feature prototypes.
In this paper, a novel framework named PCSR is proposed to tackle SFDA via a novel intra-class Polycentric Clustering and Structural Regularization strategy.
arXiv Detail & Related papers (2022-10-14T02:20:48Z)
- Class-Balanced Pixel-Level Self-Labeling for Domain Adaptive Semantic Segmentation [31.50802009879241]
Domain adaptive semantic segmentation aims to learn a model with the supervision of source domain data, and produce dense predictions on the unlabeled target domain.
One popular solution to this challenging task is self-training, which selects high-scoring predictions on target samples as pseudo labels for training.
We propose to directly explore the intrinsic pixel distributions of target domain data, instead of heavily relying on the source domain.
arXiv Detail & Related papers (2022-03-18T04:56:20Z)
- Instance Level Affinity-Based Transfer for Unsupervised Domain Adaptation [74.71931918541748]
We propose an instance affinity based criterion for source to target transfer during adaptation, called ILA-DA.
We first propose a reliable and efficient method to extract similar and dissimilar samples across source and target, and utilize a multi-sample contrastive loss to drive the domain alignment process.
We verify the effectiveness of ILA-DA by observing consistent improvements in accuracy over popular domain adaptation approaches on a variety of benchmark datasets.
arXiv Detail & Related papers (2021-04-03T01:33:14Z)
- Cross-Domain Grouping and Alignment for Domain Adaptive Semantic Segmentation [74.3349233035632]
Existing techniques for adapting semantic segmentation networks across source and target domains within deep convolutional neural networks (CNNs) do not consider inter-class variation within the target domain itself or the estimated categories.
We introduce a learnable clustering module, and a novel domain adaptation framework called cross-domain grouping and alignment.
Our method consistently boosts the adaptation performance in semantic segmentation, outperforming the state-of-the-arts on various domain adaptation settings.
arXiv Detail & Related papers (2020-12-15T11:36:21Z)
- Inductive Unsupervised Domain Adaptation for Few-Shot Classification via Clustering [16.39667909141402]
Few-shot classification tends to struggle when it needs to adapt to diverse domains.
We introduce a framework, DaFeC, to improve Domain adaptation performance for Few-shot classification via Clustering.
Our approach outperforms previous work with absolute gains (in classification accuracy) of 4.95%, 9.55%, 3.99% and 11.62%, respectively.
arXiv Detail & Related papers (2020-06-23T08:17:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.