Domain Adaptive Few-Shot Open-Set Learning
- URL: http://arxiv.org/abs/2309.12814v1
- Date: Fri, 22 Sep 2023 12:04:47 GMT
- Title: Domain Adaptive Few-Shot Open-Set Learning
- Authors: Debabrata Pal, Deeptej More, Sai Bhargav, Dipesh Tamboli, Vaneet
Aggarwal, Biplab Banerjee
- Abstract summary: We propose Domain Adaptive Few-Shot Open Set Recognition (DA-FSOS) and introduce a meta-learning-based architecture named DAFOS-NET.
Our training approach ensures that DAFOS-NET can generalize well to new scenarios in the target domain.
We present three benchmarks for DA-FSOS based on the Office-Home, mini-ImageNet/CUB, and DomainNet datasets.
- Score: 36.39622440120531
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Few-shot learning has made impressive strides in addressing the crucial
challenges of recognizing unknown samples from novel classes in target query
sets and managing visual shifts between domains. However, existing techniques
fall short when it comes to identifying target outliers under domain shifts by
learning to reject pseudo-outliers from the source domain, resulting in an
incomplete solution to both problems. To address these challenges
comprehensively, we propose a novel approach called Domain Adaptive Few-Shot
Open Set Recognition (DA-FSOS) and introduce a meta-learning-based architecture
named DAFOSNET. During training, our model learns a shared and discriminative
embedding space while creating a pseudo open-space decision boundary, given a
fully-supervised source domain and a label-disjoint few-shot target domain. To
enhance data density, we use a pair of conditional adversarial networks with
tunable noise variances to augment both domains' closed and pseudo-open spaces.
Furthermore, we propose a domain-specific batch-normalized class-prototype
alignment strategy to align both domains globally while ensuring
class-discriminativeness through novel metric objectives. Our training approach
ensures that DAFOS-NET can generalize well to new scenarios in the target
domain. We present three benchmarks for DA-FSOS based on the Office-Home,
mini-ImageNet/CUB, and DomainNet datasets and demonstrate the efficacy of
DAFOS-NET through extensive experimentation.
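The core idea of recognizing known classes while rejecting outliers can be illustrated with a minimal prototypical-classifier sketch: class prototypes are computed in a shared embedding space, and queries farther than a distance threshold from every prototype are rejected as open-set. This is a numpy-only illustration of the general technique; the threshold and function names are assumptions, not the paper's actual DAFOS-NET.

```python
import numpy as np

def prototypes(support_emb, support_lbl, n_classes):
    """Mean embedding per class (prototypical-network style)."""
    return np.stack([support_emb[support_lbl == c].mean(axis=0)
                     for c in range(n_classes)])

def classify_open_set(query_emb, protos, reject_dist):
    """Nearest-prototype label, or -1 (unknown) beyond the open-space boundary."""
    # Pairwise Euclidean distances: (n_query, n_classes)
    d = np.linalg.norm(query_emb[:, None, :] - protos[None, :, :], axis=-1)
    labels = d.argmin(axis=1)
    labels[d.min(axis=1) > reject_dist] = -1  # open-set rejection
    return labels
```

A query near a known class's prototype gets that label; a far-away query (e.g. from an unseen class) falls outside the pseudo open-space boundary and is rejected.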
Related papers
- Self-Paced Learning for Open-Set Domain Adaptation [50.620824701934]
Traditional domain adaptation methods presume that the classes in the source and target domains are identical.
Open-set domain adaptation (OSDA) addresses this limitation by allowing previously unseen classes in the target domain.
We propose a novel framework based on self-paced learning to distinguish common and unknown class samples.
arXiv Detail & Related papers (2023-03-10T14:11:09Z) - Few-Shot Object Detection in Unseen Domains [4.36080478413575]
Few-shot object detection (FSOD) has thrived in recent years to learn novel object classes with limited data.
We propose various data augmentation techniques on the few shots of novel classes to account for all possible domain-specific information.
Our experiments on the T-LESS dataset show that the proposed approach succeeds in alleviating the domain gap considerably.
arXiv Detail & Related papers (2022-04-11T13:16:41Z) - Structured Latent Embeddings for Recognizing Unseen Classes in Unseen
Domains [108.11746235308046]
We propose a novel approach that learns domain-agnostic structured latent embeddings by projecting images from different domains.
Our experiments on the challenging DomainNet and DomainNet-LS benchmarks show the superiority of our approach over existing methods.
arXiv Detail & Related papers (2021-07-12T17:57:46Z) - Cross-domain Contrastive Learning for Unsupervised Domain Adaptation [108.63914324182984]
Unsupervised domain adaptation (UDA) aims to transfer knowledge learned from a fully-labeled source domain to a different unlabeled target domain.
We build upon contrastive self-supervised learning to align features so as to reduce the domain discrepancy between training and testing sets.
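The contrastive alignment idea described above can be sketched with an InfoNCE-style loss over paired cross-domain features: matched source/target rows are pulled together, all other rows are pushed apart. This is a generic, numpy-only illustration under the assumption of one positive pair per row, not the authors' implementation.

```python
import numpy as np

def info_nce(src, tgt, temperature=0.1):
    """InfoNCE loss treating src[i] and tgt[i] as a positive pair
    and every other tgt row as a negative."""
    src = src / np.linalg.norm(src, axis=1, keepdims=True)
    tgt = tgt / np.linalg.norm(tgt, axis=1, keepdims=True)
    logits = src @ tgt.T / temperature           # cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))           # positives on the diagonal
```

When source and target features of the same class are close, the loss is near zero; misaligned pairs drive it up, which is what reduces the cross-domain discrepancy during training.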
arXiv Detail & Related papers (2021-06-10T06:32:30Z) - Towards Unsupervised Domain Adaptation for Deep Face Recognition under
Privacy Constraints via Federated Learning [33.33475702665153]
We propose a novel unsupervised federated face recognition approach (FedFR)
FedFR improves the performance in the target domain by iteratively aggregating knowledge from the source domain through federated learning.
It protects data privacy by transferring models instead of raw data between domains.
arXiv Detail & Related papers (2021-05-17T04:24:25Z) - Inferring Latent Domains for Unsupervised Deep Domain Adaptation [54.963823285456925]
Unsupervised Domain Adaptation (UDA) refers to the problem of learning a model in a target domain where labeled data are not available.
This paper introduces a novel deep architecture which addresses the problem of UDA by automatically discovering latent domains in visual datasets.
We evaluate our approach on publicly available benchmarks, showing that it outperforms state-of-the-art domain adaptation methods.
arXiv Detail & Related papers (2021-03-25T14:33:33Z) - Cluster, Split, Fuse, and Update: Meta-Learning for Open Compound Domain
Adaptive Semantic Segmentation [102.42638795864178]
We propose a principled meta-learning based approach to OCDA for semantic segmentation.
We cluster target domain into multiple sub-target domains by image styles, extracted in an unsupervised manner.
A meta-learner is thereafter deployed to learn to fuse sub-target domain-specific predictions, conditioned upon the style code.
We learn to online update the model by model-agnostic meta-learning (MAML) algorithm, thus to further improve generalization.
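The MAML-style update mentioned above reduces to: adapt a copy of the parameters with a few gradient steps on each task's support loss, then update the meta-parameters from the adapted models' query gradients. A first-order sketch on a toy squared-error loss (illustrative only; `fomaml_step` and the toy loss are assumptions, not the paper's segmentation model):

```python
import numpy as np

def grad(theta, data):
    """Gradient of the toy loss 0.5 * ||theta - data||^2."""
    return theta - data

def fomaml_step(theta, tasks, inner_lr=0.1, meta_lr=0.05, inner_steps=3):
    """First-order MAML: adapt per task on the support set, then
    average the adapted query gradients to update the meta-parameters."""
    meta_grad = np.zeros_like(theta)
    for support, query in tasks:
        phi = theta.copy()
        for _ in range(inner_steps):          # inner-loop adaptation
            phi -= inner_lr * grad(phi, support)
        meta_grad += grad(phi, query)         # first-order meta-gradient
    return theta - meta_lr * meta_grad / len(tasks)
```

Repeated meta-steps move the meta-parameters toward values that adapt quickly to each sub-target domain, which is the sense in which the model is updated online.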
arXiv Detail & Related papers (2020-12-15T13:21:54Z) - Discrepancy Minimization in Domain Generalization with Generative
Nearest Neighbors [13.047289562445242]
Domain generalization (DG) deals with the problem of domain shift, where a machine learning model trained on multiple source domains fails to generalize well on a target domain with different statistics.
Multiple approaches attempt to solve domain generalization by learning domain-invariant representations across the source domains, but these fail to guarantee generalization on the shifted target domain.
We propose a Generative Nearest Neighbor based Discrepancy Minimization (GNNDM) method which provides a theoretical guarantee that is upper bounded by the error in the labeling process of the target.
arXiv Detail & Related papers (2020-07-28T14:54:25Z) - Mind the Gap: Enlarging the Domain Gap in Open Set Domain Adaptation [65.38975706997088]
Open set domain adaptation (OSDA) assumes the presence of unknown classes in the target domain.
We show that existing state-of-the-art methods suffer a considerable performance drop in the presence of larger domain gaps.
We propose a novel framework to specifically address the larger domain gaps.
arXiv Detail & Related papers (2020-03-08T14:20:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.