Local Context-Aware Active Domain Adaptation
- URL: http://arxiv.org/abs/2208.12856v3
- Date: Sun, 27 Aug 2023 16:46:38 GMT
- Title: Local Context-Aware Active Domain Adaptation
- Authors: Tao Sun, Cheng Lu, Haibin Ling
- Abstract summary: We propose a Local context-aware Active Domain Adaptation framework, named LADA, that exploits the local context of queried target samples.
To select informative target samples, we devise a novel criterion based on the local inconsistency of model predictions.
Experiments validate that the proposed criterion chooses more informative target samples than existing active selection strategies.
- Score: 61.59201475369795
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Active Domain Adaptation (ADA) queries the labels of a small number of
selected target samples to help adapt a model from a source domain to a
target domain. The local context of queried data is important, especially when
the domain gap is large. However, this has not been fully explored by existing
ADA works. In this paper, we propose a Local context-aware ADA framework, named
LADA, to address this issue. To select informative target samples, we devise a
novel criterion based on the local inconsistency of model predictions. Since
the labeling budget is usually small, fine-tuning the model on only the queried data
can be inefficient. We progressively augment labeled target data with the
confident neighbors in a class-balanced manner. Experiments validate that the
proposed criterion chooses more informative target samples than existing active
selection strategies. Furthermore, our full method clearly surpasses recent ADA
arts on various benchmarks. Code is available at https://github.com/tsun/LADA.
Related papers
- SKADA-Bench: Benchmarking Unsupervised Domain Adaptation Methods with Realistic Validation [55.87169702896249]
Unsupervised Domain Adaptation (DA) consists of adapting a model trained on a labeled source domain to perform well on an unlabeled target domain with some data distribution shift.
We propose a framework to evaluate DA methods and present a fair evaluation of existing shallow algorithms, including reweighting, mapping, and subspace alignment.
Our benchmark highlights the importance of realistic validation and provides practical guidance for real-life applications.
arXiv Detail & Related papers (2024-07-16T12:52:29Z) - Revisiting the Domain Shift and Sample Uncertainty in Multi-source Active Domain Transfer [69.82229895838577]
Active Domain Adaptation (ADA) aims to maximally boost model adaptation in a new target domain by actively selecting a limited number of target data to annotate.
This setting neglects the more practical scenario where training data are collected from multiple sources.
This motivates us to target a new and challenging setting of knowledge transfer that extends ADA from a single source domain to multiple source domains.
arXiv Detail & Related papers (2023-11-21T13:12:21Z) - Divide and Adapt: Active Domain Adaptation via Customized Learning [56.79144758380419]
We present Divide-and-Adapt (DiaNA), a new ADA framework that partitions the target instances into four categories with stratified transferable properties.
With a novel data subdivision protocol based on uncertainty and domainness, DiaNA can accurately recognize the most gainful samples.
Thanks to the "divide-and-adapt" spirit, DiaNA can handle data with large variations of domain gap.
arXiv Detail & Related papers (2023-07-21T14:37:17Z) - Combating Label Distribution Shift for Active Domain Adaptation [16.270897459117755]
We consider the problem of active domain adaptation (ADA) to unlabeled target data.
Inspired by recent analysis on a critical issue from label distribution mismatch between source and target in domain adaptation, we devise a method that addresses the issue for the first time in ADA.
arXiv Detail & Related papers (2022-08-13T09:06:45Z) - Loss-based Sequential Learning for Active Domain Adaptation [14.366263836801485]
This paper introduces sequential learning that considers both domain type (source/target) and labelness (labeled/unlabeled).
Our model significantly outperforms previous methods as well as baseline models in various benchmark datasets.
arXiv Detail & Related papers (2022-04-25T14:00:04Z) - S$^3$VAADA: Submodular Subset Selection for Virtual Adversarial Active Domain Adaptation [49.01925978752677]
In real-world scenarios, it may be feasible to obtain labels for a small proportion of target data.
We propose S$^3$VAADA, which i) introduces a novel submodular criterion to select a maximally informative subset to label and ii) enhances a cluster-based DA procedure.
Our approach consistently outperforms the competing state-of-the-art approaches on datasets with varying degrees of domain shifts.
arXiv Detail & Related papers (2021-09-18T10:53:57Z) - OVANet: One-vs-All Network for Universal Domain Adaptation [78.86047802107025]
Existing methods manually set a threshold to reject unknown samples based on validation or a pre-defined ratio of unknown samples.
We propose a method to learn the threshold using source samples and to adapt it to the target domain.
Our idea is that the minimum inter-class distance in the source domain should be a good threshold for deciding whether a target sample is known or unknown.
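That intuition could be illustrated with class prototypes, as in the hypothetical sketch below. This is not OVANet's actual method (which learns the threshold with one-vs-all classifiers); the prototype-distance formulation and all names are assumptions for illustration only.

```python
import numpy as np

def fit_known_threshold(source_feats, source_labels):
    """Compute class prototypes (mean source features per class) and use
    the minimum pairwise distance between distinct prototypes as the
    known-vs-unknown threshold suggested above."""
    classes = np.unique(source_labels)
    protos = np.stack([source_feats[source_labels == c].mean(axis=0)
                       for c in classes])
    dists = np.linalg.norm(protos[:, None] - protos[None, :], axis=-1)
    np.fill_diagonal(dists, np.inf)  # ignore zero self-distances
    return protos, dists.min()

def is_known(target_feat, protos, threshold):
    """Treat a target sample as 'known' if it lies within the threshold
    of some source class prototype, otherwise as 'unknown'."""
    return np.linalg.norm(protos - target_feat, axis=1).min() <= threshold
```

The appeal of such a threshold is that it needs no validation set and no pre-defined ratio of unknown samples, which is exactly the limitation of prior work that the summary points out.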
arXiv Detail & Related papers (2021-04-07T18:36:31Z) - Test-time Unsupervised Domain Adaptation [3.4188171733930584]
Convolutional neural networks rarely generalise to different scanners or acquisition protocols (target domain).
We show that models adapted to a specific target subject from the target domain outperform a domain adaptation method which has seen more data of the target domain but not this specific target subject.
arXiv Detail & Related papers (2020-10-05T11:30:36Z) - Enlarging Discriminative Power by Adding an Extra Class in Unsupervised Domain Adaptation [5.377369521932011]
We propose to enlarge discriminative power by adding a new, artificial class and training the model on the original data together with GAN-generated samples of the new class.
Our idea is highly generic so that it is compatible with many existing methods such as DANN, VADA, and DIRT-T.
arXiv Detail & Related papers (2020-02-19T07:58:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.