Learning Disentangled Semantic Representation for Domain Adaptation
- URL: http://arxiv.org/abs/2012.11807v1
- Date: Tue, 22 Dec 2020 03:03:36 GMT
- Title: Learning Disentangled Semantic Representation for Domain Adaptation
- Authors: Ruichu Cai, Zijian Li, Pengfei Wei, Jie Qiao, Kun Zhang, Zhifeng Hao
- Abstract summary: We aim to extract the domain-invariant semantic information in the latent disentangled semantic representation of the data.
Under the above assumption, we employ a variational auto-encoder to reconstruct the semantic latent variables and domain latent variables.
We devise a dual adversarial network to disentangle these two sets of reconstructed latent variables.
- Score: 39.055191615410244
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Domain adaptation is an important but challenging task. Most existing domain adaptation methods struggle to extract a domain-invariant representation from a feature space in which domain information and semantic information are entangled. Different from previous efforts on such entangled feature spaces, we aim to extract the domain-invariant semantic information in a latent disentangled semantic representation (DSR) of the data. In DSR, we assume the data-generation process is controlled by two independent sets of variables, i.e., the semantic latent variables and the domain latent variables. Under this assumption, we employ a variational auto-encoder to reconstruct the semantic latent variables and domain latent variables behind the data. We further devise a dual adversarial network to disentangle these two sets of reconstructed latent variables. The disentangled semantic latent variables are finally adapted across the domains. Experimental studies show that our model yields state-of-the-art performance on several domain adaptation benchmark datasets.
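The pipeline the abstract describes (a VAE that infers two independent latent sets, a dual adversarial network that disentangles them, and adaptation on the semantic code) can be sketched compactly. The sketch below is an illustrative reconstruction, not the authors' released implementation: the MLP encoder/decoder, the layer sizes, and the gradient-reversal realization of the dual adversarial heads are assumptions, and the training losses (reconstruction, KL, source classification, the two adversarial terms, and cross-domain alignment of the semantic code) are left out.

```python
# Hypothetical DSR-style sketch: shapes and the gradient-reversal trick are
# illustrative assumptions, not the paper's released code.
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips (and scales) gradients backward."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.lam * grad_out, None

def grad_reverse(x, lam=1.0):
    return GradReverse.apply(x, lam)

class DSRSketch(nn.Module):
    def __init__(self, x_dim=784, z_sem=32, z_dom=32, n_classes=10, n_domains=2):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, 256), nn.ReLU())
        # Two independent latent sets, per the paper's generative assumption.
        self.mu_s, self.lv_s = nn.Linear(256, z_sem), nn.Linear(256, z_sem)
        self.mu_d, self.lv_d = nn.Linear(256, z_dom), nn.Linear(256, z_dom)
        self.dec = nn.Sequential(nn.Linear(z_sem + z_dom, 256), nn.ReLU(),
                                 nn.Linear(256, x_dim))
        # Dual adversarial heads: each latent set is penalized for carrying
        # the other factor's information.
        self.dom_from_sem = nn.Linear(z_sem, n_domains)
        self.cls_from_dom = nn.Linear(z_dom, n_classes)
        self.classifier = nn.Linear(z_sem, n_classes)  # task head on semantics

    @staticmethod
    def reparameterize(mu, logvar):
        return mu + torch.randn_like(mu) * (0.5 * logvar).exp()

    def forward(self, x, lam=1.0):
        h = self.enc(x)
        z_s = self.reparameterize(self.mu_s(h), self.lv_s(h))
        z_d = self.reparameterize(self.mu_d(h), self.lv_d(h))
        x_hat = self.dec(torch.cat([z_s, z_d], dim=1))
        # Gradient reversal makes these heads adversaries of the encoder.
        dom_logits = self.dom_from_sem(grad_reverse(z_s, lam))
        leak_logits = self.cls_from_dom(grad_reverse(z_d, lam))
        return x_hat, self.classifier(z_s), dom_logits, leak_logits
```

Training would combine the ELBO terms with source-domain classification on z_s plus the two adversarial terms, after which the disentangled semantic latents would be aligned across domains, as the abstract states.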
Related papers
- Unsupervised Multiple Domain Translation through Controlled Disentanglement in Variational Autoencoder [1.7611027732647493]
Unsupervised Multiple Domain Translation is the task of transforming data from one domain to other domains without having paired data to train the systems.
Our proposal relies on a modified version of a Variational Autoencoder.
One of these latent variables is constrained to depend exclusively on the domain, while the other must capture the remaining factors of variability in the data (a minimal sketch of this split appears after this list).
arXiv Detail & Related papers (2024-01-17T12:43:28Z)
- Adversarial Bi-Regressor Network for Domain Adaptive Regression [52.5168835502987]
It is essential to learn a cross-domain regressor to mitigate the domain shift.
This paper proposes a novel method, Adversarial Bi-Regressor Network (ABRNet), to seek a more effective cross-domain regression model.
arXiv Detail & Related papers (2022-09-20T18:38:28Z)
- Identifiable Latent Causal Content for Domain Adaptation under Latent Covariate Shift [82.14087963690561]
Multi-source domain adaptation (MSDA) addresses the challenge of learning a label prediction function for an unlabeled target domain.
We present an intricate causal generative model by introducing latent noises across domains, along with a latent content variable and a latent style variable.
The proposed approach showcases exceptional performance and efficacy on both simulated and real-world datasets.
arXiv Detail & Related papers (2022-08-30T11:25:15Z)
- Vector-Decomposed Disentanglement for Domain-Invariant Object Detection [75.64299762397268]
We try to disentangle domain-invariant representations from domain-specific representations.
In the experiments, we evaluate our method on the single- and compound-target cases.
arXiv Detail & Related papers (2021-08-15T07:58:59Z)
- Self-Adversarial Disentangling for Specific Domain Adaptation [52.1935168534351]
Domain adaptation aims to bridge the domain shifts between the source and target domains.
Recent methods typically do not consider explicit prior knowledge about a specific dimension of the domain shift.
arXiv Detail & Related papers (2021-08-08T02:36:45Z)
- Semi-Supervised Disentangled Framework for Transferable Named Entity Recognition [27.472171967604602]
We present a semi-supervised framework for transferable NER, which disentangles the domain-invariant latent variables and domain-specific latent variables.
Our model obtains state-of-the-art performance on cross-domain and cross-lingual NER benchmark datasets.
arXiv Detail & Related papers (2020-12-22T02:55:04Z)
- Cross-Domain Latent Modulation for Variational Transfer Learning [1.9212368803706577]
We propose a cross-domain latent modulation mechanism within a variational autoencoder (VAE) framework to enable improved transfer learning.
We apply the proposed model to a number of transfer learning tasks including unsupervised domain adaptation and image-to-image translation.
arXiv Detail & Related papers (2020-12-21T22:45:00Z)
- Domain Adaptation for Semantic Parsing [68.81787666086554]
We propose a novel semantic parser for domain adaptation, where we have much fewer annotated data in the target domain compared to the source domain.
Our parser benefits from a two-stage coarse-to-fine framework, and thus can provide different and accurate treatments for the two stages.
Experiments on a benchmark dataset show that our method consistently outperforms several popular domain adaptation strategies.
arXiv Detail & Related papers (2020-06-23T14:47:41Z)
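Several of the entries above, like the main paper itself, rest on the same mechanism: split the latent code and constrain one part to carry only domain information while the remainder stays domain-free. Below is a minimal sketch of that constraint, referenced from the first entry; the slice-based split, the head shapes, and the two-step adversarial scheme are illustrative assumptions, not any single paper's method.

```python
# Hypothetical split-latent constraint: one latent slice is trained to
# predict the domain, the rest is adversarially made domain-uninformative.
import torch
import torch.nn.functional as F

def split_latent_losses(z, dom_head, adv_head, domain_labels, dom_dim=16):
    """z: (B, D) latent code; the first dom_dim units are the domain slice."""
    z_dom, z_rest = z[:, :dom_dim], z[:, dom_dim:]
    # Cooperative term: the domain slice must predict the domain label.
    loss_dom = F.cross_entropy(dom_head(z_dom), domain_labels)
    # Adversarial terms, two-step: the discriminator learns to read the
    # domain from z_rest, while the encoder is pushed to make z_rest
    # uninformative by matching the discriminator's output to uniform.
    loss_disc = F.cross_entropy(adv_head(z_rest.detach()), domain_labels)
    logits = adv_head(z_rest)
    uniform = torch.full_like(logits, 1.0 / logits.size(1))
    loss_enc = F.kl_div(F.log_softmax(logits, dim=1), uniform,
                        reduction="batchmean")
    return loss_dom, loss_disc, loss_enc
```

Here loss_disc updates only the adversary (its input is detached), while loss_enc is applied during the encoder's step with the adversary's parameters held fixed; together they push the domain slice and the remainder toward carrying complementary information.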
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.