Tightening Classification Boundaries in Open Set Domain Adaptation
through Unknown Exploitation
- URL: http://arxiv.org/abs/2309.08964v1
- Date: Sat, 16 Sep 2023 11:33:40 GMT
- Title: Tightening Classification Boundaries in Open Set Domain Adaptation
through Unknown Exploitation
- Authors: Lucas Fernando Alvarenga e Silva, Nicu Sebe, Jurandy Almeida
- Abstract summary: Convolutional Neural Networks (CNNs) have brought revolutionary advances to many research areas.
But when those methods are applied to non-controllable environments, many different factors can degrade the model's expected performance.
We propose a novel way to improve OSDA approaches by extracting a high-confidence set of unknown instances.
- Score: 45.74830585715129
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Convolutional Neural Networks (CNNs) have brought revolutionary advances to
many research areas due to their capacity of learning from raw data. However,
when those methods are applied to non-controllable environments, many different
factors can degrade the model's expected performance, such as unlabeled
datasets with different levels of domain shift and category shift.
In particular, when both issues occur simultaneously, we tackle this
challenging setup as the Open Set Domain Adaptation (OSDA) problem. In general,
existing OSDA approaches focus their efforts only on aligning known classes or,
if they already extract possible negative instances, use them as a new category
learned with supervision during the course of training. We propose a novel way
to improve OSDA approaches by extracting a high-confidence set of unknown
instances and using it as a hard constraint to tighten the classification
boundaries of OSDA methods. Specifically, we adopt a new loss constraint
evaluated in three different ways: (1) directly with the pristine negative
instances; (2) with randomly transformed negatives using data augmentation
techniques; and (3) with synthetically generated negatives containing
adversarial features. We assessed all approaches in an extensive set of
experiments based on OVANet, where we could observe consistent improvements for
two public benchmarks, the Office-31 and Office-Home datasets, yielding
absolute gains of up to 1.3% for both Accuracy and H-Score on Office-31 and
5.8% for Accuracy and 4.7% for H-Score on Office-Home.
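The two-step idea in the abstract, select high-confidence unknown target instances, then penalize the classifier unless it assigns them to the unknown class, can be sketched in NumPy. Everything below is an assumption for illustration: the function names, the threshold value, and the convention that the last logit scores the unknown class (OVANet itself uses per-class one-vs-all classifiers). Variants (2) and (3) from the abstract would reuse the same loss on augmented or adversarially perturbed copies of the selected instances. The `h_score` helper implements the standard harmonic mean of known- and unknown-class accuracy used by the reported benchmarks.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with max-subtraction for numerical stability."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def select_high_confidence_unknowns(logits, threshold=0.9):
    """Boolean mask over target instances whose predicted probability of
    the unknown class (assumed here to be the last logit) exceeds the
    threshold. The threshold value is illustrative, not from the paper."""
    return softmax(logits)[:, -1] > threshold

def unknown_constraint_loss(logits, mask):
    """Hard constraint: mean negative log-likelihood of the unknown class
    over the selected instances. Applying it to randomly augmented or
    adversarially perturbed copies gives variants (2) and (3)."""
    p = softmax(logits)[mask]
    if p.shape[0] == 0:
        return 0.0
    return float(-np.log(p[:, -1] + 1e-12).mean())

def h_score(acc_known, acc_unknown):
    """Harmonic mean of known-class and unknown-class accuracy."""
    if acc_known + acc_unknown == 0:
        return 0.0
    return 2 * acc_known * acc_unknown / (acc_known + acc_unknown)

# Toy usage: two target instances over 2 known classes + 1 unknown slot.
logits = np.array([[0.0, 0.0, 5.0],   # confidently unknown
                   [5.0, 0.0, 0.0]])  # confidently known
mask = select_high_confidence_unknowns(logits)
loss = unknown_constraint_loss(logits, mask)
```

In training, this loss would be added to the base OSDA objective so that the tightened boundary comes from real (or transformed) negatives rather than from the known classes alone.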
Related papers
- Beyond the Known: Enhancing Open Set Domain Adaptation with Unknown Exploration [40.2428948628001]
We introduce a new approach to improve OSDA techniques by extracting a set of high-confidence unknown instances.
We achieve an H-score similar to other state-of-the-art methods while increasing the accuracy on unknown categories.
arXiv Detail & Related papers (2024-12-24T02:27:35Z)
- Upcycling Models under Domain and Category Shift [95.22147885947732]
We introduce an innovative global and local clustering learning technique (GLC).
We design a novel, adaptive one-vs-all global clustering algorithm to achieve the distinction across different target classes.
Remarkably, in the most challenging open-partial-set DA scenario, GLC outperforms UMAD by 14.8% on the VisDA benchmark.
arXiv Detail & Related papers (2023-03-13T13:44:04Z)
- Imbalanced Open Set Domain Adaptation via Moving-threshold Estimation and Gradual Alignment [58.56087979262192]
Open Set Domain Adaptation (OSDA) aims to transfer knowledge from a well-labeled source domain to an unlabeled target domain.
The performance of OSDA methods degrades drastically under intra-domain class imbalance and inter-domain label shift.
We propose Open-set Moving-threshold Estimation and Gradual Alignment (OMEGA) to alleviate the negative effects raised by label shift.
arXiv Detail & Related papers (2023-03-08T05:55:02Z)
- Divide and Contrast: Source-free Domain Adaptation via Adaptive Contrastive Learning [122.62311703151215]
Divide and Contrast (DaC) aims to connect the good ends of both worlds while bypassing their limitations.
DaC divides the target data into source-like and target-specific samples, where either group of samples is treated with tailored goals.
We further align the source-like domain with the target-specific samples using a memory bank-based Maximum Mean Discrepancy (MMD) loss to reduce the distribution mismatch.
arXiv Detail & Related papers (2022-11-12T09:21:49Z)
- Open Set Domain Adaptation By Novel Class Discovery [118.25447367755737]
In Open Set Domain Adaptation (OSDA), large numbers of target samples are drawn from implicit categories that never appear in the source domain.
We propose Self-supervised Class-Discovering Adapter that attempts to achieve OSDA by gradually discovering those implicit classes.
arXiv Detail & Related papers (2022-03-07T12:16:46Z)
- Unsupervised domain adaptation with non-stochastic missing data [0.6608945629704323]
We consider unsupervised domain adaptation (UDA) for classification problems in the presence of missing data in the unlabelled target domain.
Imputation is performed in a domain-invariant latent space and leverages indirect supervision from a complete source domain.
We show the benefits of jointly performing adaptation, classification and imputation on datasets.
arXiv Detail & Related papers (2021-09-16T06:37:07Z)
- Partially-Shared Variational Auto-encoders for Unsupervised Domain Adaptation with Target Shift [11.873435088539459]
This paper proposes a novel approach for unsupervised domain adaptation (UDA) with target shift.
The proposed method, partially shared variational autoencoders (PS-VAEs), uses pair-wise feature alignment instead of feature distribution matching.
PS-VAEs inter-convert domain of each sample by a CycleGAN-based architecture while preserving its label-related content.
arXiv Detail & Related papers (2020-01-22T06:41:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.