Domain Adaptation from Scratch
- URL: http://arxiv.org/abs/2209.00830v1
- Date: Fri, 2 Sep 2022 05:55:09 GMT
- Title: Domain Adaptation from Scratch
- Authors: Eyal Ben-David, Yftah Ziser, Roi Reichart
- Abstract summary: We present a new learning setup, ``domain adaptation from scratch'', which we believe to be crucial for extending the reach of NLP to sensitive domains.
In this setup, we aim to efficiently annotate data from a set of source domains such that the trained model performs well on a sensitive target domain.
Our study compares several approaches for this challenging setup, ranging from data selection and domain adaptation algorithms to active learning paradigms.
- Score: 24.612696638386623
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Natural language processing (NLP) algorithms are rapidly improving but often
struggle when applied to out-of-distribution examples. A prominent approach to
mitigate the domain gap is domain adaptation, where a model trained on a source
domain is adapted to a new target domain. We present a new learning setup,
``domain adaptation from scratch'', which we believe to be crucial for
extending the reach of NLP to sensitive domains in a privacy-preserving manner.
In this setup, we aim to efficiently annotate data from a set of source domains
such that the trained model performs well on a sensitive target domain from
which data is unavailable for annotation. Our study compares several approaches
for this challenging setup, ranging from data selection and domain adaptation
algorithms to active learning paradigms, on two NLP tasks: sentiment analysis
and Named Entity Recognition. Our results suggest that the aforementioned
approaches narrow the domain gap, and that combining them further improves the
results.
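To make the setup concrete, below is a minimal sketch of the simplest data-selection baseline compared in studies of this kind: rank candidate source examples by similarity to the target domain and spend the annotation budget on the closest ones. It assumes a small *unlabeled* target sample may be observed (only annotation of target data is ruled out); the function names and TF-IDF features are illustrative, not the paper's exact procedure.

```python
# Hypothetical data-selection sketch: annotate the source examples that look
# most like the (unlabelable) target domain. Assumes raw target text can be
# observed; TF-IDF similarity stands in for whatever representation is used.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def select_for_annotation(source_texts, target_texts, budget):
    """Indices of the `budget` source examples closest to the target centroid."""
    vec = TfidfVectorizer().fit(source_texts + target_texts)
    src = vec.transform(source_texts)
    tgt_centroid = np.asarray(vec.transform(target_texts).mean(axis=0))
    scores = cosine_similarity(src, tgt_centroid).ravel()
    return np.argsort(scores)[::-1][:budget]

# Example: spend a budget of 2 annotations on the most target-like sources.
src = ["stock prices fell sharply", "the movie was great",
       "patient shows mild fever", "loved the soundtrack"]
tgt = ["patient reports chest pain", "symptoms include fever and cough"]
print(select_for_annotation(src, tgt, budget=2))
```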
Related papers
- Stratified Domain Adaptation: A Progressive Self-Training Approach for Scene Text Recognition [1.2878987353423252]
Unsupervised domain adaptation (UDA) has become increasingly prevalent in scene text recognition (STR).
We introduce the Stratified Domain Adaptation (StrDA) approach, which exploits a gradual escalation of the domain gap to structure the learning process.
We propose a novel method for employing domain discriminators to estimate the out-of-distribution and domain discriminative levels of data samples.
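A minimal sketch of the domain-discriminator idea described above, assuming fixed feature vectors are available: a source-vs-target classifier's predicted probability serves as a per-sample domain-gap score, and sorting by it yields strata of gradually increasing gap for progressive self-training. The feature extraction and stratum count are illustrative, not StrDA's exact design.

```python
# Hedged sketch: score each target sample's distance from the source domain
# with a binary domain discriminator, then split into strata by that score.
import numpy as np
from sklearn.linear_model import LogisticRegression

def stratify_by_domain_gap(src_feats, tgt_feats, n_strata=3):
    X = np.vstack([src_feats, tgt_feats])
    y = np.concatenate([np.zeros(len(src_feats)), np.ones(len(tgt_feats))])
    disc = LogisticRegression(max_iter=1000).fit(X, y)
    gap = disc.predict_proba(tgt_feats)[:, 1]  # higher = further from source
    order = np.argsort(gap)                    # most source-like first
    return np.array_split(order, n_strata)     # strata for progressive self-training

rng = np.random.default_rng(0)
strata = stratify_by_domain_gap(rng.normal(0.0, 1.0, (100, 8)),
                                rng.normal(0.5, 1.0, (80, 8)))
print([len(s) for s in strata])
```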
arXiv Detail & Related papers (2024-10-13T16:40:48Z) - Open-Set Domain Adaptation with Visual-Language Foundation Models [51.49854335102149]
Unsupervised domain adaptation (UDA) has proven to be very effective in transferring knowledge from a source domain to a target domain with unlabeled data.
Open-set domain adaptation (ODA) has emerged as a potential solution to identify these classes during the training phase.
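For context, a common open-set baseline (not this paper's foundation-model method) rejects target samples whose classifier confidence is too low, flagging them as unknown classes instead of forcing a known label; a minimal sketch:

```python
# Max-softmax-probability rejection: a standard open-set baseline, shown only
# to illustrate the "identify unseen classes" problem; the threshold is an
# assumption.
import numpy as np

def open_set_predict(logits, threshold=0.7, unknown=-1):
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    preds = probs.argmax(axis=1)
    preds[probs.max(axis=1) < threshold] = unknown
    return preds

print(open_set_predict(np.array([[4.0, 0.1, 0.2],     # confident -> class 0
                                 [1.0, 1.1, 0.9]])))  # diffuse -> unknown (-1)
```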
arXiv Detail & Related papers (2023-07-30T11:38:46Z) - Meta-causal Learning for Single Domain Generalization [102.53303707563612]
Single domain generalization aims to learn a model from a single training domain (source domain) and apply it to multiple unseen test domains (target domains).
Existing methods focus on expanding the distribution of the training domain to cover the target domains, but without estimating the domain shift between the source and target domains.
We propose a new learning paradigm, namely simulate-analyze-reduce, which first simulates the domain shift by building an auxiliary domain as the target domain, then learns to analyze the causes of domain shift, and finally learns to reduce the domain shift for model adaptation.
arXiv Detail & Related papers (2023-04-07T15:46:38Z) - Stagewise Unsupervised Domain Adaptation with Adversarial Self-Training
for Road Segmentation of Remote Sensing Images [93.50240389540252]
Road segmentation from remote sensing images is a challenging task with a wide range of potential applications.
We propose a novel stagewise domain adaptation model called RoadDA to address the domain shift (DS) issue in this field.
Experiment results on two benchmarks demonstrate that RoadDA can efficiently reduce the domain gap and outperforms state-of-the-art methods.
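The adversarial ingredient in models of this kind is typically a gradient-reversal domain classifier in the style of DANN; the sketch below shows that generic mechanism, not RoadDA's exact stagewise architecture.

```python
# Generic adversarial feature alignment (DANN-style gradient reversal),
# shown as a stand-in for the adversarial step in stagewise adaptation.
import torch
from torch import nn

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad):
        return -ctx.lam * grad, None  # reversed gradient for the extractor

features = nn.Sequential(nn.Linear(16, 32), nn.ReLU())  # toy feature extractor
domain_head = nn.Linear(32, 2)                          # source-vs-target head

x_src, x_tgt = torch.randn(4, 16), torch.randn(4, 16)
feats = features(torch.cat([x_src, x_tgt]))
domain_logits = domain_head(GradReverse.apply(feats, 1.0))
domain_labels = torch.tensor([0, 0, 0, 0, 1, 1, 1, 1])
loss = nn.functional.cross_entropy(domain_logits, domain_labels)
loss.backward()  # the extractor is pushed toward domain-invariant features
```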
arXiv Detail & Related papers (2021-08-28T09:29:14Z) - Neural Supervised Domain Adaptation by Augmenting Pre-trained Models
with Random Units [14.183224769428843]
Neural Transfer Learning (TL) is becoming ubiquitous in Natural Language Processing (NLP).
In this paper, we show through interpretation methods that this scheme, despite its efficiency, suffers from a key limitation.
We propose to augment the pre-trained model with normalised, weighted and randomly initialised units that foster a better adaptation while maintaining the valuable source knowledge.
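A hedged sketch of that idea: keep the pre-trained encoder intact and route the input through an additional randomly initialised layer, normalising and weighting the two streams before they are concatenated. The dimensions and the single learned gate are illustrative assumptions, not the paper's exact design.

```python
import torch
from torch import nn

class AugmentedEncoder(nn.Module):
    """Pre-trained units plus normalised, weighted, randomly initialised ones."""
    def __init__(self, pretrained: nn.Module, in_dim=768, pre_dim=768, rand_dim=128):
        super().__init__()
        self.pretrained = pretrained                     # preserves source knowledge
        self.random_units = nn.Linear(in_dim, rand_dim)  # fresh, target-adaptable
        self.norm_pre = nn.LayerNorm(pre_dim)
        self.norm_rand = nn.LayerNorm(rand_dim)
        self.alpha = nn.Parameter(torch.tensor(0.5))     # learned stream weighting

    def forward(self, x):
        h_pre = self.norm_pre(self.pretrained(x))
        h_rand = self.norm_rand(self.random_units(x))
        return torch.cat([(1 - self.alpha) * h_pre, self.alpha * h_rand], dim=-1)

enc = AugmentedEncoder(pretrained=nn.Linear(768, 768))  # toy stand-in encoder
print(enc(torch.randn(2, 768)).shape)                   # torch.Size([2, 896])
```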
arXiv Detail & Related papers (2021-06-09T09:29:11Z) - Instance Level Affinity-Based Transfer for Unsupervised Domain
Adaptation [74.71931918541748]
We propose an instance affinity based criterion for source to target transfer during adaptation, called ILA-DA.
We first propose a reliable and efficient method to extract similar and dissimilar samples across source and target, and utilize a multi-sample contrastive loss to drive the domain alignment process.
We verify the effectiveness of ILA-DA by observing consistent improvements in accuracy over popular domain adaptation approaches on a variety of benchmark datasets.
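A minimal sketch of the multi-sample contrastive ingredient, assuming the similar/dissimilar pair mining (the paper's actual contribution) is already given as a mask:

```python
# InfoNCE-style cross-domain contrastive loss over mined pairs; the mining
# step that produces `sim_mask` is assumed here, not implemented.
import torch
import torch.nn.functional as F

def cross_domain_contrastive(src, tgt, sim_mask, tau=0.1):
    """src: (n, d) source feats; tgt: (m, d) target feats;
    sim_mask: (n, m) bool, True where a pair was mined as similar."""
    logits = F.normalize(src, dim=1) @ F.normalize(tgt, dim=1).T / tau
    log_p = F.log_softmax(logits, dim=1)
    # Pull each source sample toward its mined similar target samples.
    per_row = (log_p * sim_mask).sum(1) / sim_mask.sum(1).clamp(min=1)
    return -per_row.mean()

src, tgt = torch.randn(8, 16), torch.randn(6, 16)
mask = torch.zeros(8, 6, dtype=torch.bool)
mask[torch.arange(6), torch.arange(6)] = True  # toy one-to-one mined pairs
print(cross_domain_contrastive(src, tgt, mask))
```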
arXiv Detail & Related papers (2021-04-03T01:33:14Z) - Domain Adaptation in LiDAR Semantic Segmentation by Aligning Class
Distributions [9.581605678437032]
This work addresses the problem of unsupervised domain adaptation for LiDAR semantic segmentation models.
Our approach combines novel ideas on top of the current state-of-the-art approaches and yields new state-of-the-art results.
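One common way to realise "aligning class distributions" (shown here as a generic sketch, not necessarily this paper's exact loss) is to penalise the divergence between the class distribution predicted on target scans and the known source label prior:

```python
import torch
import torch.nn.functional as F

def class_distribution_loss(target_logits, source_prior):
    """KL divergence between batch-averaged target predictions and source prior."""
    pred_dist = F.softmax(target_logits, dim=1).mean(dim=0)  # average over points
    return F.kl_div(pred_dist.log(), source_prior, reduction="sum")

logits = torch.randn(1024, 5)  # e.g. 1024 LiDAR points, 5 semantic classes
prior = torch.tensor([0.4, 0.3, 0.15, 0.1, 0.05])  # assumed source class prior
print(class_distribution_loss(logits, prior))
```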
arXiv Detail & Related papers (2020-10-23T08:52:15Z) - Sequential Domain Adaptation through Elastic Weight Consolidation for
Sentiment Analysis [3.1473798197405944]
We propose a model-independent framework, Sequential Domain Adaptation (SDA).
Our experiments show that the proposed framework enables simple architectures such as CNNs to outperform complex state-of-the-art models in domain adaptation for sentiment analysis (SA).
In addition, we observe that a harder-first, anti-curriculum ordering of source domains yields the best performance.
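For reference, the Elastic Weight Consolidation penalty underlying SDA is the standard quadratic pull toward the weights learned on earlier domains, scaled by Fisher-information importance; a minimal sketch (SDA's exact hyper-parameters are not reproduced):

```python
import torch

def ewc_penalty(model, old_params, fisher, lam=1000.0):
    """0.5 * lam * sum_i F_i * (theta_i - theta*_i)^2 over all parameters."""
    loss = 0.0
    for name, p in model.named_parameters():
        loss = loss + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return 0.5 * lam * loss

# After finishing source domain k, snapshot weights and Fisher estimates,
# then add the penalty to the task loss while training on domain k + 1.
model = torch.nn.Linear(4, 2)
old = {n: p.detach().clone() for n, p in model.named_parameters()}
fisher = {n: torch.ones_like(p) for n, p in model.named_parameters()}
print(ewc_penalty(model, old, fisher))  # zero until the weights drift
```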
arXiv Detail & Related papers (2020-07-02T15:21:56Z) - Domain Adaptation for Semantic Parsing [68.81787666086554]
We propose a novel semantic parser for domain adaptation, where far less annotated data is available in the target domain than in the source domain.
Our parser benefits from a two-stage coarse-to-fine framework, which provides distinct and accurate treatment of the two stages.
Experiments on a benchmark dataset show that our method consistently outperforms several popular domain adaptation strategies.
arXiv Detail & Related papers (2020-06-23T14:47:41Z) - Supervised Domain Adaptation using Graph Embedding [86.3361797111839]
Domain adaptation methods assume that the distributions of the two domains are shifted and attempt to realign them.
We propose a generic framework based on graph embedding.
We show that the proposed approach leads to a powerful Domain Adaptation framework.
arXiv Detail & Related papers (2020-03-09T12:25:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.