Distance-based Hyperspherical Classification for Multi-source Open-Set
Domain Adaptation
- URL: http://arxiv.org/abs/2107.02067v1
- Date: Mon, 5 Jul 2021 14:56:57 GMT
- Title: Distance-based Hyperspherical Classification for Multi-source Open-Set
Domain Adaptation
- Authors: Silvia Bucci, Francesco Cappio Borlino, Barbara Caputo, Tatiana
Tommasi
- Abstract summary: Vision systems trained in closed-world scenarios will inevitably fail when presented with new environmental conditions.
How to move towards open-world learning is a long-standing research question.
In this work we tackle multi-source Open-Set domain adaptation by introducing HyMOS.
- Score: 34.97934677830779
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Vision systems trained in closed-world scenarios will inevitably fail when
presented with new environmental conditions, new data distributions and novel
classes at deployment time. How to move towards open-world learning is a
long-standing research question, but the existing solutions mainly focus on specific
aspects of the problem (single domain Open-Set, multi-domain Closed-Set), or
propose complex strategies which combine multiple losses and manually tuned
hyperparameters. In this work we tackle multi-source Open-Set domain adaptation
by introducing HyMOS: a straightforward supervised model that exploits the
power of contrastive learning and the properties of its hyperspherical feature
space to correctly predict known labels on the target, while rejecting samples
belonging to any unknown class. HyMOS includes a tailored data balancing to
enforce cross-source alignment and introduces style transfer among the instance
transformations of contrastive learning for source-target adaptation, avoiding
the risk of negative transfer. Finally, a self-training strategy refines the
model without the need for handcrafted thresholds. We validate our method over
three challenging datasets and provide an extensive quantitative and
qualitative experimental analysis. The obtained results show that HyMOS
outperforms several Open-Set and universal domain adaptation approaches,
defining the new state-of-the-art.
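The abstract's core mechanism, predicting known labels by distance to class references on a hyperspherical (L2-normalized) feature space while rejecting far-away samples as unknown, can be sketched as follows. This is a minimal illustration, not HyMOS itself: the function name, the use of per-class prototypes, and the fixed `reject_threshold` are assumptions for exposition (the paper specifically avoids handcrafted thresholds via self-training).

```python
import numpy as np

def hyperspherical_classify(features, prototypes, reject_threshold=0.5):
    """Assign each feature to the nearest class prototype on the unit
    hypersphere; samples too far from every prototype are marked unknown."""
    # Project features and prototypes onto the unit hypersphere.
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    protos = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    # On the sphere, cosine similarity is a monotone proxy for distance.
    sims = feats @ protos.T                      # shape: (n_samples, n_classes)
    labels = sims.argmax(axis=1)
    # Reject samples whose best similarity is below the threshold (-1 = unknown).
    labels[sims.max(axis=1) < reject_threshold] = -1
    return labels
```

In this sketch, prototypes could be per-class mean features from the source domains; the key property exploited is that contrastive training concentrates same-class features in tight angular clusters, so a single similarity cut separates known from unknown.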
Related papers
- Uncertainty-guided Open-Set Source-Free Unsupervised Domain Adaptation with Target-private Class Segregation [22.474866164542302]
UDA approaches commonly assume that source and target domains share the same labels space.
This paper considers the more challenging Source-Free Open-set Domain Adaptation (SF-OSDA) setting.
We propose a novel approach for SF-OSDA that exploits the granularity of target-private categories by segregating their samples into multiple unknown classes.
arXiv Detail & Related papers (2024-04-16T13:52:00Z)
- Unified Source-Free Domain Adaptation [44.95240684589647]
In pursuit of transferring a source model to a target domain without access to the source training data, Source-Free Domain Adaptation (SFDA) has been extensively explored.
We propose a novel approach called Latent Causal Factors Discovery (LCFD).
In contrast to previous alternatives that emphasize learning the statistical description of reality, we formulate LCFD from a causality perspective.
arXiv Detail & Related papers (2024-03-12T12:40:08Z)
- DiffClass: Diffusion-Based Class Incremental Learning [30.514281721324853]
Class Incremental Learning (CIL) is challenging due to catastrophic forgetting.
Recent exemplar-free CIL methods attempt to mitigate catastrophic forgetting by synthesizing previous task data.
We propose a novel exemplar-free CIL method to overcome these issues.
arXiv Detail & Related papers (2024-03-08T03:34:18Z)
- Tackling Diverse Minorities in Imbalanced Classification [80.78227787608714]
Imbalanced datasets are commonly observed in various real-world applications, presenting significant challenges in training classifiers.
We propose generating synthetic samples iteratively by mixing data samples from both minority and majority classes.
We demonstrate the effectiveness of our proposed framework through extensive experiments conducted on seven publicly available benchmark datasets.
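The idea of synthesizing samples by mixing minority- and majority-class instances resembles a mixup-style interpolation; the sketch below is an illustrative guess at such a step, not the paper's actual algorithm. The function name, the Beta-distributed mixing weight, and the bias toward the minority sample are all assumptions.

```python
import numpy as np

def mix_minority_majority(x_min, x_maj, alpha=0.5, rng=None):
    """Create one synthetic minority-class sample as a convex combination
    of a minority sample and a majority sample (mixup-style)."""
    rng = np.random.default_rng() if rng is None else rng
    lam = rng.beta(alpha, alpha)
    # Bias the weight toward the minority sample so the synthetic point
    # stays on the minority side of the decision boundary.
    lam = max(lam, 1.0 - lam)
    return lam * x_min + (1.0 - lam) * x_maj
```

Applied iteratively over minority/majority pairs, this populates the sparse minority region with plausible interpolated samples rather than exact duplicates.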
arXiv Detail & Related papers (2023-08-28T18:48:34Z)
- LatentDR: Improving Model Generalization Through Sample-Aware Latent Degradation and Restoration [22.871920291497094]
We propose a novel approach for distribution-aware latent augmentation.
Our approach first degrades the samples in the latent space, mapping them to augmented labels, and then restores the samples during training.
We show that our method can be flexibly adapted to long-tail recognition tasks, demonstrating its versatility in building more generalizable models.
arXiv Detail & Related papers (2023-08-28T14:08:42Z)
- Consistency Regularization for Generalizable Source-free Domain Adaptation [62.654883736925456]
Source-free domain adaptation (SFDA) aims to adapt a well-trained source model to an unlabelled target domain without accessing the source dataset.
Existing SFDA methods assess their adapted models only on the target training set, neglecting data from unseen but identically distributed test sets.
We propose a consistency regularization framework to develop a more generalizable SFDA method.
arXiv Detail & Related papers (2023-08-03T07:45:53Z)
- Learning Invariant Representations and Risks for Semi-supervised Domain Adaptation [109.73983088432364]
We propose the first method that aims to simultaneously learn invariant representations and risks under the setting of semi-supervised domain adaptation (Semi-DA).
We introduce the LIRR algorithm for jointly Learning Invariant Representations and Risks.
arXiv Detail & Related papers (2020-10-09T15:42:35Z)
- A Review of Single-Source Deep Unsupervised Visual Domain Adaptation [81.07994783143533]
Large-scale labeled training datasets have enabled deep neural networks to excel across a wide range of benchmark vision tasks.
In many applications, it is prohibitively expensive and time-consuming to obtain large quantities of labeled data.
To cope with limited labeled training data, many have attempted to directly apply models trained on a large-scale labeled source domain to another sparsely labeled or unlabeled target domain.
arXiv Detail & Related papers (2020-09-01T00:06:50Z)
- Universal Source-Free Domain Adaptation [57.37520645827318]
We propose a novel two-stage learning process for domain adaptation.
In the Procurement stage, we aim to equip the model for future source-free deployment, assuming no prior knowledge of the upcoming category-gap and domain-shift.
In the Deployment stage, the goal is to design a unified adaptation algorithm capable of operating across a wide range of category-gaps.
arXiv Detail & Related papers (2020-04-09T07:26:20Z)
- Unsupervised Domain Adaptation in Person re-ID via k-Reciprocal Clustering and Large-Scale Heterogeneous Environment Synthesis [76.46004354572956]
We introduce an unsupervised domain adaptation approach for person re-identification.
Experimental results show that the proposed ktCUDA and SHRED approach achieves an average improvement of +5.7 mAP in re-identification performance.
arXiv Detail & Related papers (2020-01-14T17:43:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.