Exploring Category-Agnostic Clusters for Open-Set Domain Adaptation
- URL: http://arxiv.org/abs/2006.06567v1
- Date: Thu, 11 Jun 2020 16:19:02 GMT
- Title: Exploring Category-Agnostic Clusters for Open-Set Domain Adaptation
- Authors: Yingwei Pan and Ting Yao and Yehao Li and Chong-Wah Ngo and Tao Mei
- Abstract summary: We present Self-Ensembling with Category-agnostic Clusters (SE-CC) -- a novel architecture that steers domain adaptation with category-agnostic clusters in the target domain.
Clustering is performed over all unlabeled target samples to obtain category-agnostic clusters, which reveal the underlying data-space structure peculiar to the target domain.
- Score: 138.29273453811945
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unsupervised domain adaptation has received significant attention in recent
years. Most existing works tackle the closed-set scenario, assuming that the
source and target domains share exactly the same categories. In practice,
however, a target domain often contains samples of classes unseen in the
source domain (i.e., the unknown class). Extending domain adaptation from the
closed-set to such an open-set setting is not trivial, since the target samples
in the unknown class are not expected to align with the source. In this paper, we
address this problem by augmenting the state-of-the-art domain adaptation
technique, Self-Ensembling, with category-agnostic clusters in the target domain.
Specifically, we present Self-Ensembling with Category-agnostic Clusters
(SE-CC) -- a novel architecture that steers domain adaptation with the
additional guidance of category-agnostic clusters that are specific to the target
domain. This clustering information provides domain-specific visual cues,
facilitating the generalization of Self-Ensembling to both closed-set and
open-set scenarios. Technically, clustering is first performed over all the
unlabeled target samples to obtain the category-agnostic clusters, which reveal
the underlying data-space structure peculiar to the target domain. A clustering
branch is then exploited to ensure that the learnt representation preserves
this underlying structure by matching the estimated assignment distribution
over clusters to the inherent cluster distribution for each target sample.
Furthermore, SE-CC enhances the learnt representation with mutual information
maximization. Extensive experiments are conducted on the Office and VisDA datasets
for both open-set and closed-set domain adaptation, and superior results are
reported in comparison with state-of-the-art approaches.
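The abstract only sketches the two target-domain objectives at a high level. As a rough illustration (not the authors' released code), the PyTorch-style snippet below shows one plausible way to realize them: a KL term that matches the clustering branch's predicted assignment distribution to a soft assignment derived from k-means over target features, and a mutual-information term over classifier outputs. The function names, the soft k-means assignment, and the exact MI formulation are assumptions made here for illustration.

```python
import torch
import torch.nn.functional as F


def cluster_alignment_loss(cluster_logits, kmeans_soft_assign):
    """KL divergence between the inherent cluster distribution and the
    predicted assignment distribution (hypothetical formulation).

    cluster_logits:     (B, K) outputs of the clustering branch.
    kmeans_soft_assign: (B, K) soft assignments over the K category-agnostic
                        clusters, e.g. a softmax over negative distances to
                        the k-means centroids of the target features.
    """
    log_pred = F.log_softmax(cluster_logits, dim=1)
    # F.kl_div expects log-probabilities as input and probabilities as target.
    return F.kl_div(log_pred, kmeans_soft_assign, reduction="batchmean")


def mutual_information_loss(class_logits, eps=1e-8):
    """Negative mutual information between inputs and predicted labels.

    A common surrogate: maximize the entropy of the marginal prediction
    distribution (balanced usage of classes) while minimizing the average
    per-sample conditional entropy (confident predictions). Minimizing the
    returned value maximizes this mutual-information estimate.
    """
    p = F.softmax(class_logits, dim=1)                   # (B, C)
    cond_entropy = -(p * torch.log(p + eps)).sum(dim=1).mean()
    p_marginal = p.mean(dim=0)                           # (C,)
    marg_entropy = -(p_marginal * torch.log(p_marginal + eps)).sum()
    return cond_entropy - marg_entropy
```

In a training loop, these two losses would be added (with suitable weights) to the usual Self-Ensembling consistency loss on target samples and the supervised loss on source samples; the weighting scheme is not specified in the abstract and would need to be taken from the paper itself.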
Related papers
- Self-Paced Learning for Open-Set Domain Adaptation [50.620824701934]
Traditional domain adaptation methods presume that the classes in the source and target domains are identical.
Open-set domain adaptation (OSDA) addresses this limitation by allowing previously unseen classes in the target domain.
We propose a novel framework based on self-paced learning to distinguish common and unknown class samples.
arXiv Detail & Related papers (2023-03-10T14:11:09Z) - Unsupervised Domain Adaptation via Distilled Discriminative Clustering [45.39542287480395]
We re-cast the domain adaptation problem as discriminative clustering of target data.
We propose to jointly train the network using parallel, supervised learning objectives over labeled source data.
We conduct careful ablation studies and extensive experiments on five popular benchmark datasets.
arXiv Detail & Related papers (2023-02-23T13:03:48Z) - Polycentric Clustering and Structural Regularization for Source-free
Unsupervised Domain Adaptation [20.952542421577487]
Source-Free Domain Adaptation (SFDA) aims to solve the domain adaptation problem by transferring the knowledge learned from a pre-trained source model to an unseen target domain.
Most existing methods assign pseudo-labels to the target data by generating feature prototypes.
In this paper, a framework named PCSR is proposed to tackle SFDA via a novel intra-class Polycentric Clustering and Structural Regularization strategy.
arXiv Detail & Related papers (2022-10-14T02:20:48Z) - Adaptive Methods for Aggregated Domain Generalization [26.215904177457997]
In many settings, privacy concerns prohibit obtaining domain labels for the training data samples.
We propose a domain-adaptive approach to this problem, which operates in two steps.
Our approach achieves state-of-the-art performance on a variety of domain generalization benchmarks without using domain labels.
arXiv Detail & Related papers (2021-12-09T08:57:01Z) - Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation [85.6961770631173]
In semi-supervised domain adaptation, a few labeled samples per class in the target domain guide features of the remaining target samples to aggregate around them.
We propose a novel approach called Cross-domain Adaptive Clustering to address this problem.
arXiv Detail & Related papers (2021-04-19T16:07:32Z) - Instance Level Affinity-Based Transfer for Unsupervised Domain
Adaptation [74.71931918541748]
We propose an instance-affinity-based criterion for source-to-target transfer during adaptation, called ILA-DA.
We first propose a reliable and efficient method to extract similar and dissimilar samples across source and target, and utilize a multi-sample contrastive loss to drive the domain alignment process.
We verify the effectiveness of ILA-DA by observing consistent improvements in accuracy over popular domain adaptation approaches on a variety of benchmark datasets.
arXiv Detail & Related papers (2021-04-03T01:33:14Z) - Towards Uncovering the Intrinsic Data Structures for Unsupervised Domain
Adaptation using Structurally Regularized Deep Clustering [119.88565565454378]
Unsupervised domain adaptation (UDA) aims to learn classification models that make predictions for unlabeled data in a target domain.
We propose a hybrid model of Structurally Regularized Deep Clustering, which integrates the regularized discriminative clustering of target data with a generative one.
Our proposed H-SRDC outperforms all the existing methods under both the inductive and transductive settings.
arXiv Detail & Related papers (2020-12-08T08:52:00Z) - Universal Domain Adaptation through Self Supervision [75.04598763659969]
Unsupervised domain adaptation methods assume that all source categories are present in the target domain.
We propose Domain Adaptative Neighborhood Clustering via Entropy optimization (DANCE) to handle arbitrary category shift.
We show through extensive experiments that DANCE outperforms baselines across open-set, open-partial and partial domain adaptation settings.
arXiv Detail & Related papers (2020-02-19T01:26:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.