Tightening Classification Boundaries in Open Set Domain Adaptation
through Unknown Exploitation
- URL: http://arxiv.org/abs/2309.08964v1
- Date: Sat, 16 Sep 2023 11:33:40 GMT
- Authors: Lucas Fernando Alvarenga e Silva, Nicu Sebe, Jurandy Almeida
- Abstract summary: Convolutional Neural Networks (CNNs) have brought revolutionary advances to many research areas.
But when those methods are applied to non-controllable environments, many different factors can degrade the model's expected performance.
We propose a novel way to improve OSDA approaches by extracting a high-confidence set of unknown instances.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Convolutional Neural Networks (CNNs) have brought revolutionary advances to
many research areas due to their capacity of learning from raw data. However,
when those methods are applied to non-controllable environments, many different
factors can degrade the model's expected performance, such as unlabeled
datasets with different levels of domain shift and category shift.
In particular, when both issues occur at the same time, we tackle this
challenging setup as the Open Set Domain Adaptation (OSDA) problem. In general,
existing OSDA approaches focus their efforts only on aligning known classes or,
if they already extract possible negative instances, use them as a new category
learned with supervision during the course of training. We propose a novel way
to improve OSDA approaches by extracting a high-confidence set of unknown
instances and using it as a hard constraint to tighten the classification
boundaries of OSDA methods. Specifically, we adopt a new loss constraint
evaluated in three different ways: (1) directly with the pristine negative
instances; (2) with randomly transformed negatives using data augmentation
techniques; and (3) with synthetically generated negatives containing
adversarial features. We assessed all approaches in an extensive set of
experiments based on OVANet, where we could observe consistent improvements for
two public benchmarks, the Office-31 and Office-Home datasets, yielding
absolute gains of up to 1.3% for both Accuracy and H-Score on Office-31 and
5.8% for Accuracy and 4.7% for H-Score on Office-Home.
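The hard constraint described in the abstract could be illustrated with a minimal sketch. The NumPy code below is purely illustrative, not the paper's implementation: the function names, the confidence threshold, and the entropy-based penalty on known-class predictions are assumptions; the actual constraint operates on OVANet's one-vs-all open-set heads.

```python
import numpy as np

def select_confident_unknowns(unknown_probs, threshold=0.9):
    """Select target samples whose 'unknown' probability exceeds a
    confidence threshold; these form the high-confidence unknown set.
    `unknown_probs` and `threshold` are illustrative assumptions."""
    unknown_probs = np.asarray(unknown_probs, dtype=float)
    return np.where(unknown_probs >= threshold)[0]

def unknown_constraint_loss(known_logits, unknown_idx):
    """Hard-constraint penalty on the selected unknowns: discourage
    confident known-class predictions by minimizing the negative
    entropy of their known-class softmax distribution."""
    logits = np.asarray(known_logits, dtype=float)[unknown_idx]
    # numerically stable softmax over known classes
    z = logits - logits.max(axis=1, keepdims=True)
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    # negative entropy: minimal when p is uniform (maximally unsure)
    neg_entropy = np.sum(p * np.log(p + 1e-12), axis=1)
    return float(neg_entropy.mean())
```

The same penalty could be applied to the three negative variants listed above (pristine, augmented, or adversarially generated negatives) by simply changing which samples feed `known_logits`.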
Related papers
- A Dataset for Semantic Segmentation in the Presence of Unknowns [49.795683850385956]
Existing datasets allow evaluation of only knowns or unknowns - but not both.
We propose a novel anomaly segmentation dataset, ISSU, that features a diverse set of anomaly inputs from cluttered real-world environments.
The dataset is twice as large as existing anomaly segmentation datasets.
arXiv Detail & Related papers (2025-03-28T10:31:01Z) - RoCA: Robust Contrastive One-class Time Series Anomaly Detection with Contaminated Data [19.25420308920505]
Methods based on normality assumptions face three limitations.
Their basic assumption is that the training data is uncontaminated (free of anomalies).
This paper proposes a novel robust approach, RoCA, which is the first to address all of the above three challenges.
arXiv Detail & Related papers (2025-03-24T06:52:28Z) - Beyond the Known: Enhancing Open Set Domain Adaptation with Unknown Exploration [40.2428948628001]
We introduce a new approach to improve OSDA techniques by extracting a set of high-confidence unknown instances.
We achieve an H-score similar to other state-of-the-art methods, while increasing the accuracy on unknown categories.
arXiv Detail & Related papers (2024-12-24T02:27:35Z) - Taxonomy Adaptive Cross-Domain Adaptation in Medical Imaging via
Optimization Trajectory Distillation [73.83178465971552]
The success of automated medical image analysis depends on large-scale and expert-annotated training sets.
Unsupervised domain adaptation (UDA) has been raised as a promising approach to alleviate the burden of labeled data collection.
We propose optimization trajectory distillation, a unified approach to address the two technical challenges from a new perspective.
arXiv Detail & Related papers (2023-07-27T08:58:05Z) - Upcycling Models under Domain and Category Shift [95.22147885947732]
We introduce an innovative global and local clustering learning technique (GLC).
We design a novel, adaptive one-vs-all global clustering algorithm to achieve the distinction across different target classes.
Remarkably, in the most challenging open-partial-set DA scenario, GLC outperforms UMAD by 14.8% on the VisDA benchmark.
arXiv Detail & Related papers (2023-03-13T13:44:04Z) - Imbalanced Open Set Domain Adaptation via Moving-threshold Estimation
and Gradual Alignment [58.56087979262192]
Open Set Domain Adaptation (OSDA) aims to transfer knowledge from a well-labeled source domain to an unlabeled target domain.
The performance of OSDA methods degrades drastically under intra-domain class imbalance and inter-domain label shift.
We propose Open-set Moving-threshold Estimation and Gradual Alignment (OMEGA) to alleviate the negative effects raised by label shift.
arXiv Detail & Related papers (2023-03-08T05:55:02Z) - Divide and Contrast: Source-free Domain Adaptation via Adaptive
Contrastive Learning [122.62311703151215]
Divide and Contrast (DaC) aims to connect the good ends of both worlds while bypassing their limitations.
DaC divides the target data into source-like and target-specific samples, where either group of samples is treated with tailored goals.
We further align the source-like domain with the target-specific samples using a memory bank-based Maximum Mean Discrepancy (MMD) loss to reduce the distribution mismatch.
arXiv Detail & Related papers (2022-11-12T09:21:49Z) - Open Set Domain Adaptation By Novel Class Discovery [118.25447367755737]
In Open Set Domain Adaptation (OSDA), large numbers of target samples are drawn from implicit categories that never appear in the source domain.
We propose Self-supervised Class-Discovering Adapter that attempts to achieve OSDA by gradually discovering those implicit classes.
arXiv Detail & Related papers (2022-03-07T12:16:46Z) - E-ADDA: Unsupervised Adversarial Domain Adaptation Enhanced by a New
Mahalanobis Distance Loss for Smart Computing [25.510639595356597]
In smart computing, the labels of training samples for a specific task are not always abundant.
We propose a novel UDA algorithm, E-ADDA, which uses both a novel variation of the Mahalanobis distance loss and an out-of-distribution detection subroutine.
In the acoustic modality, E-ADDA outperforms several state-of-the-art UDA algorithms by up to 29.8% in F1 score.
In the computer vision modality, the evaluation results suggest that we achieve new state-of-the-art performance on popular UDA benchmarks.
arXiv Detail & Related papers (2022-01-24T23:20:55Z) - Unsupervised domain adaptation with non-stochastic missing data [0.6608945629704323]
We consider unsupervised domain adaptation (UDA) for classification problems in the presence of missing data in the unlabelled target domain.
Imputation is performed in a domain-invariant latent space and leverages indirect supervision from a complete source domain.
We show the benefits of jointly performing adaptation, classification and imputation on datasets.
arXiv Detail & Related papers (2021-09-16T06:37:07Z) - Partially-Shared Variational Auto-encoders for Unsupervised Domain
Adaptation with Target Shift [11.873435088539459]
This paper proposes a novel approach for unsupervised domain adaptation (UDA) with target shift.
The proposed method, partially shared variational autoencoders (PS-VAEs), uses pair-wise feature alignment instead of feature distribution matching.
PS-VAEs inter-convert domain of each sample by a CycleGAN-based architecture while preserving its label-related content.
arXiv Detail & Related papers (2020-01-22T06:41:31Z)
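The Divide and Contrast entry above reduces distribution mismatch with a memory bank-based Maximum Mean Discrepancy (MMD) loss. A minimal NumPy sketch of the biased squared-MMD estimate with an RBF kernel (ignoring the memory bank, which is specific to that method) might look like:

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    """Gaussian RBF kernel matrix between sample sets x and y,
    each of shape (n_samples, n_features)."""
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def mmd2(x, y, sigma=1.0):
    """Biased estimate of the squared Maximum Mean Discrepancy
    between the empirical distributions of x and y: zero when the
    two sample sets coincide, larger as they diverge."""
    return float(rbf_kernel(x, x, sigma).mean()
                 + rbf_kernel(y, y, sigma).mean()
                 - 2.0 * rbf_kernel(x, y, sigma).mean())
```

Minimizing this quantity between source-like and target-specific features is one standard way to align the two groups of samples.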
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.