Open Set Domain Adaptation By Novel Class Discovery
- URL: http://arxiv.org/abs/2203.03329v1
- Date: Mon, 7 Mar 2022 12:16:46 GMT
- Title: Open Set Domain Adaptation By Novel Class Discovery
- Authors: Jingyu Zhuang, Ziliang Chen, Pengxu Wei, Guanbin Li, Liang Lin
- Abstract summary: In Open Set Domain Adaptation (OSDA), large numbers of target samples are drawn from implicit categories that never appear in the source domain.
We propose the Self-supervised Class-Discovering Adapter (SCDA), which attempts to achieve OSDA by gradually discovering those implicit classes.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In Open Set Domain Adaptation (OSDA), large numbers of target samples are drawn from implicit categories that never appear in the source domain. Lacking any specific labels for these samples, existing methods indiscriminately regard them as a single "unknown" class. We challenge this broadly adopted practice, which may cause unexpected detrimental effects because the decision boundaries between the implicit categories are entirely ignored. Instead, we propose the Self-supervised Class-Discovering Adapter (SCDA), which attempts to achieve OSDA by gradually discovering those implicit classes, then incorporating them to restructure the classifier and update the domain-adaptive features iteratively. SCDA performs two alternating steps to achieve implicit class discovery and self-supervised OSDA, respectively. By jointly optimizing the two tasks, SCDA achieves state-of-the-art results in OSDA and shows competitive performance in unearthing the implicit target classes.
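The alternating discover-then-restructure loop described in the abstract can be illustrated with a toy sketch. This is hypothetical code, not the authors' implementation: it uses a nearest-centroid classifier and treats low-confidence target samples as candidates for one implicit class per round; the function name, confidence proxy, and all parameters are assumptions for illustration.

```python
import numpy as np

def discover_and_adapt(src_feats, src_labels, tgt_feats,
                       n_rounds=3, conf_threshold=0.5):
    """Toy alternating loop in the spirit of SCDA (hypothetical sketch):
    (1) flag low-confidence target samples, (2) form one implicit class
    from them, (3) restructure the classifier with the new class."""
    # Start with one centroid per known source class.
    classes = sorted(set(src_labels))
    centroids = np.stack([src_feats[np.array(src_labels) == c].mean(0)
                          for c in classes])
    for _ in range(n_rounds):
        # Step 1: pseudo-label target samples by distance to each centroid.
        d = np.linalg.norm(tgt_feats[:, None] - centroids[None], axis=2)
        conf = np.exp(-d.min(1))     # crude confidence proxy
        low = conf < conf_threshold  # candidates for an implicit class
        if low.sum() == 0:
            break
        # Step 2: "discover" one implicit class as the mean of rejected
        # samples (a real method would cluster; one centroid per round
        # keeps the sketch short).
        new_centroid = tgt_feats[low].mean(0, keepdims=True)
        # Step 3: restructure the classifier with the discovered class.
        centroids = np.concatenate([centroids, new_centroid])
    return centroids
```

On synthetic data where some target samples sit far from every known class, the loop adds a centroid near that unknown cluster and then converges, mirroring the paper's iterative restructuring at a very small scale.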
Related papers
- Recall and Refine: A Simple but Effective Source-free Open-set Domain Adaptation Framework
Open-set Domain Adaptation (OSDA) aims to adapt a model from a labeled source domain to an unlabeled target domain.
We propose Recall and Refine (RRDA), a novel source-free OSDA (SF-OSDA) framework designed to address these limitations by explicitly learning features for target-private unknown classes.
arXiv Detail & Related papers (2024-11-19T15:18:50Z)
- Upcycling Models under Domain and Category Shift
We introduce an innovative global and local clustering learning technique (GLC).
We design a novel, adaptive one-vs-all global clustering algorithm to distinguish between different target classes.
Remarkably, in the most challenging open-partial-set DA scenario, GLC outperforms UMAD by 14.8% on the VisDA benchmark.
arXiv Detail & Related papers (2023-03-13T13:44:04Z)
- Imbalanced Open Set Domain Adaptation via Moving-threshold Estimation and Gradual Alignment
Open Set Domain Adaptation (OSDA) aims to transfer knowledge from a well-labeled source domain to an unlabeled target domain.
The performance of OSDA methods degrades drastically under intra-domain class imbalance and inter-domain label shift.
We propose Open-set Moving-threshold Estimation and Gradual Alignment (OMEGA) to alleviate the negative effects caused by label shift.
arXiv Detail & Related papers (2023-03-08T05:55:02Z)
- Shuffle Augmentation of Features from Unlabeled Data for Unsupervised Domain Adaptation
Unsupervised Domain Adaptation (UDA) is a branch of transfer learning where labels for target samples are unavailable.
In this paper, we propose Shuffle Augmentation of Features (SAF) as a novel UDA framework.
SAF learns from the target samples, adaptively distills class-aware target features, and implicitly guides the classifier to find comprehensive class borders.
arXiv Detail & Related papers (2022-01-28T07:11:05Z)
- UMAD: Universal Model Adaptation under Domain and Category Shift
The Universal Model ADaptation (UMAD) framework handles both open-set and open-partial-set UDA scenarios without access to source data.
We develop an informative consistency score to help distinguish unknown samples from known samples.
Experiments on open-set and open-partial-set UDA scenarios demonstrate that UMAD exhibits comparable, if not superior, performance to state-of-the-art data-dependent methods.
arXiv Detail & Related papers (2021-12-16T01:22:59Z)
- Generalized Domain Adaptation
We give a general representation of UDA problems, named Generalized Domain Adaptation (GDA).
GDA covers the major variants as special cases, which allows us to organize them in a comprehensive framework.
We propose a novel approach to the new setting, which is self-supervised class-destructive learning.
arXiv Detail & Related papers (2021-06-03T07:55:18Z)
- Casting a BAIT for Offline and Online Source-free Domain Adaptation
We address the source-free domain adaptation (SFDA) problem, where only the source model is available during adaptation to the target domain.
Inspired by diverse-classifier-based domain adaptation methods, in this paper we introduce a second classifier.
When adapting to the target domain, this additional classifier, initialized from the source model, is expected to find misclassified features.
Our method surpasses other SFDA methods by a large margin in the online source-free domain adaptation setting.
arXiv Detail & Related papers (2020-10-23T14:18:42Z)
- Class-Incremental Domain Adaptation
We introduce a practical Domain Adaptation (DA) paradigm called Class-Incremental Domain Adaptation (CIDA).
Existing DA methods tackle domain shift but are unsuitable for learning novel target-domain classes.
Our approach yields superior performance compared to both DA and class-incremental (CI) methods in the CIDA paradigm.
arXiv Detail & Related papers (2020-08-04T07:55:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences.