Positive-Unlabeled Domain Adaptation
- URL: http://arxiv.org/abs/2202.05695v1
- Date: Fri, 11 Feb 2022 15:32:02 GMT
- Title: Positive-Unlabeled Domain Adaptation
- Authors: Jonas Sonntag, Gunnar Behrens, Lars Schmidt-Thieme
- Abstract summary: We present a novel two-step learning approach to the problem of Positive-Unlabeled Domain Adaptation.
We identify reliable positive and negative pseudo-labels in the target domain guided by source domain labels and a positive-unlabeled risk estimator.
We validate our approach by running experiments on benchmark datasets for visual object recognition.
- Score: 7.143879014059893
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Domain Adaptation methodologies have been shown to generalize
effectively from a labeled source domain to a label-scarce target domain.
Previous research has focused either on unsupervised domain adaptation without
any target supervision or on semi-supervised domain adaptation with a few
labeled target examples per class. Positive-Unlabeled (PU-) Learning, on the
other hand, has attracted increasing interest in the weakly supervised learning
literature, since in many real-world applications positive labels are much
easier to obtain than negative ones. In this work we are the first to introduce
the challenge of Positive-Unlabeled Domain Adaptation, where we aim to
generalize from a fully labeled source domain to a target domain in which only
positive and unlabeled data are available. We present a novel two-step learning
approach to this problem: first, we identify reliable positive and negative
pseudo-labels in the target domain, guided by source domain labels and a
positive-unlabeled risk estimator; this then enables us to train a standard
classifier on the target domain in a second step. We validate our approach with
experiments on benchmark datasets for visual object recognition. Furthermore,
we propose real-world examples for our setting and demonstrate superior
performance on parking occupancy data.
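The abstract's first step scores unlabeled target points with a positive-unlabeled risk estimator before pseudo-labeling them. The paper's exact estimator is not reproduced here; the sketch below instead uses the widely known non-negative PU risk in the style of Kiryo et al. (2017) together with a hypothetical confidence-threshold selection rule. The function names, the sigmoid surrogate loss, and the thresholds are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid_loss(scores, label):
    # Surrogate loss l(z, y) = sigmoid(-y * z), a common choice in PU learning.
    return 1.0 / (1.0 + np.exp(label * scores))

def nn_pu_risk(scores_p, scores_u, prior):
    """Non-negative PU risk estimate (in the style of Kiryo et al., 2017).

    scores_p : classifier scores on positive-labeled examples
    scores_u : classifier scores on unlabeled examples
    prior    : assumed class prior pi_p = P(y = +1); estimated in practice
    """
    risk_p_pos = sigmoid_loss(scores_p, +1).mean()  # positives as positive
    risk_p_neg = sigmoid_loss(scores_p, -1).mean()  # positives as negative
    risk_u_neg = sigmoid_loss(scores_u, -1).mean()  # unlabeled as negative
    # The estimated negative-class risk is clamped at zero, which keeps the
    # overall risk estimate non-negative.
    return prior * risk_p_pos + max(0.0, risk_u_neg - prior * risk_p_neg)

def select_pseudo_labels(scores_u, pos_thresh=2.0, neg_thresh=-2.0):
    # Step one of a two-step pipeline: keep only confidently scored unlabeled
    # points as pseudo-positives / pseudo-negatives (thresholds are
    # illustrative, not taken from the paper).
    pos_idx = np.where(scores_u >= pos_thresh)[0]
    neg_idx = np.where(scores_u <= neg_thresh)[0]
    return pos_idx, neg_idx
```

In step two, a standard binary classifier could then be trained on the pseudo-labeled target subset, as the abstract describes.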
Related papers
- Combating Label Distribution Shift for Active Domain Adaptation [16.270897459117755]
We consider the problem of active domain adaptation (ADA) to unlabeled target data.
Inspired by recent analysis on a critical issue from label distribution mismatch between source and target in domain adaptation, we devise a method that addresses the issue for the first time in ADA.
arXiv Detail & Related papers (2022-08-13T09:06:45Z)
- Cycle Label-Consistent Networks for Unsupervised Domain Adaptation [57.29464116557734]
Domain adaptation aims to leverage a labeled source domain to learn a classifier for the unlabeled target domain with a different distribution.
We propose a simple yet efficient domain adaptation method, i.e. the Cycle Label-Consistent Network (CLCN), by exploiting the cycle consistency of classification labels.
We demonstrate the effectiveness of our approach on the MNIST-USPS-SVHN, Office-31, Office-Home and ImageCLEF-DA benchmarks.
arXiv Detail & Related papers (2022-05-27T13:09:08Z)
- CA-UDA: Class-Aware Unsupervised Domain Adaptation with Optimal Assignment and Pseudo-Label Refinement [84.10513481953583]
Unsupervised domain adaptation (UDA) focuses on the selection of good pseudo-labels as surrogates for the missing labels in the target data.
Source domain bias that deteriorates the pseudo-labels can still exist, since the shared network of the source and target domains is typically used for pseudo-label selection.
We propose CA-UDA to improve the quality of the pseudo-labels and UDA results with optimal assignment, a pseudo-label refinement strategy and class-aware domain alignment.
arXiv Detail & Related papers (2022-05-26T18:45:04Z)
- Domain Adaptive Semantic Segmentation without Source Data [50.18389578589789]
We investigate domain adaptive semantic segmentation without source data, which assumes that the model is pre-trained on the source domain.
We propose an effective framework for this challenging problem with two components: positive learning and negative learning.
Our framework can be easily implemented and incorporated with other methods to further enhance the performance.
arXiv Detail & Related papers (2021-10-13T04:12:27Z)
- Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation [85.6961770631173]
In semi-supervised domain adaptation, a few labeled samples per class in the target domain guide features of the remaining target samples to aggregate around them.
We propose a novel approach called Cross-domain Adaptive Clustering to address this problem.
arXiv Detail & Related papers (2021-04-19T16:07:32Z)
- Effective Label Propagation for Discriminative Semi-Supervised Domain Adaptation [76.41664929948607]
Semi-supervised domain adaptation (SSDA) methods have demonstrated great potential in large-scale image classification tasks.
We present a novel and effective method to tackle this problem by using effective inter-domain and intra-domain semantic information propagation.
Our source code and pre-trained models will be released soon.
arXiv Detail & Related papers (2020-12-04T14:28:19Z)
- Cross-domain Self-supervised Learning for Domain Adaptation with Few Source Labels [78.95901454696158]
We propose a novel Cross-Domain Self-supervised learning approach for domain adaptation.
Our method significantly boosts performance of target accuracy in the new target domain with few source labels.
arXiv Detail & Related papers (2020-03-18T15:11:07Z)
- A simple baseline for domain adaptation using rotation prediction [17.539027866457673]
The goal is to adapt a model trained in one domain to another domain with scarce annotated data.
We propose a simple yet effective method based on self-supervised learning.
Our simple method achieves state-of-the-art results on semi-supervised domain adaptation on DomainNet dataset.
arXiv Detail & Related papers (2019-12-26T17:32:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.