DINO: A Conditional Energy-Based GAN for Domain Translation
- URL: http://arxiv.org/abs/2102.09281v1
- Date: Thu, 18 Feb 2021 11:52:45 GMT
- Title: DINO: A Conditional Energy-Based GAN for Domain Translation
- Authors: Konstantinos Vougioukas, Stavros Petridis and Maja Pantic
- Abstract summary: Domain translation is the process of transforming data from one domain to another while preserving the common semantics.
Some of the most popular domain translation systems are based on conditional generative adversarial networks.
We propose a new framework, where two networks are simultaneously trained, in a supervised manner, to perform domain translation in opposite directions.
- Score: 67.9879720396872
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Domain translation is the process of transforming data from one domain to
another while preserving the common semantics. Some of the most popular domain
translation systems are based on conditional generative adversarial networks,
which use source domain data to drive the generator and as an input to the
discriminator. However, this approach does not enforce the preservation of
shared semantics since the conditional input can often be ignored by the
discriminator. We propose an alternative method for conditioning and present a
new framework, where two networks are simultaneously trained, in a supervised
manner, to perform domain translation in opposite directions. Our method is not
only better at capturing the shared information between two domains but is more
generic and can be applied to a broader range of problems. The proposed
framework performs well even in challenging cross-modal translations, such as
video-driven speech reconstruction, for which other systems struggle to
maintain correspondence.
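The core idea, two networks translating in opposite directions, with the discriminator scoring a pair by how well it can translate the output back to the source, can be illustrated on a toy problem. The following is a minimal sketch under strong simplifying assumptions (scalar linear "networks", a hand-rolled EBGAN-style margin loss, illustrative hyperparameters), not the paper's implementation:

```python
import numpy as np

# Toy paired domains sharing the semantic content x:
# domain X holds x, domain Y holds y = 2x.
x = np.linspace(0.5, 1.5, 8)
y_real = 2.0 * x

a = 0.5    # generator G: X -> Y, y_hat = a * x
b = 0.1    # energy-based discriminator D: Y -> X, x_hat = b * y
margin, lr = 1.0, 0.05

def energy(b, x, y):
    # Energy of a pair (x, y) = error of the reverse translation Y -> X.
    return np.mean((b * y - x) ** 2)

for _ in range(2000):
    y_fake = a * x
    # Discriminator step: low energy on real pairs; push the energy of
    # generated pairs up, but only while it is below the margin.
    grad_b = np.mean(2.0 * (b * y_real - x) * y_real)
    if energy(b, x, y_fake) < margin:
        grad_b -= np.mean(2.0 * (b * y_fake - x) * y_fake)
    b -= lr * grad_b
    # Generator step: produce translations the discriminator assigns low energy.
    grad_a = np.mean(2.0 * (b * a * x - x) * b * x)
    a -= lr * grad_a
```

In this sketch both directions end up recovering the shared semantics: the generator converges toward the forward map (a ≈ 2) and the discriminator toward the reverse map (b ≈ 0.5), so the conditioning input cannot be ignored.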
Related papers
- Domain-Agnostic Mutual Prompting for Unsupervised Domain Adaptation [27.695825570272874]
Conventional Unsupervised Domain Adaptation (UDA) strives to minimize distribution discrepancy between domains.
We propose Domain-Agnostic Mutual Prompting (DAMP) to exploit domain-invariant semantics.
Experiments on three UDA benchmarks demonstrate the superiority of DAMP over state-of-the-art approaches.
arXiv Detail & Related papers (2024-03-05T12:06:48Z)
- Unsupervised Domain Adaptation for Semantic Segmentation using One-shot Image-to-Image Translation via Latent Representation Mixing [9.118706387430883]
We propose a new unsupervised domain adaptation method for the semantic segmentation of very high resolution images.
An image-to-image translation paradigm is proposed, based on an encoder-decoder principle where latent content representations are mixed across domains.
Cross-city comparative experiments have shown that the proposed method outperforms state-of-the-art domain adaptation methods.
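The latent-mixing idea can be illustrated with a minimal sketch. The encoder/decoder stand-ins and the convex blending scheme below are assumptions for illustration, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in encoder/decoder: real models would be convolutional networks;
# fixed linear maps keep the sketch self-contained.
W_enc = rng.normal(size=(16, 64))   # image (64,) -> latent (16,)
W_dec = rng.normal(size=(64, 16))   # latent (16,) -> image (64,)

def encode(img):
    return W_enc @ img

def decode(z):
    return W_dec @ z

def mix_latents(z_src, z_tgt, alpha=0.5):
    # Cross-domain mixing: a convex blend of latent content representations,
    # so the decoded image carries content from both domains.
    return alpha * z_src + (1 - alpha) * z_tgt

src, tgt = rng.normal(size=64), rng.normal(size=64)
mixed = decode(mix_latents(encode(src), encode(tgt), alpha=0.7))
```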
arXiv Detail & Related papers (2022-12-07T18:16:17Z)
- Cyclically Disentangled Feature Translation for Face Anti-spoofing [61.70377630461084]
We propose a novel domain adaptation method called the cyclically disentangled feature translation network (CDFTN).
CDFTN generates pseudo-labeled samples that possess: 1) source domain-invariant liveness features and 2) target domain-specific content features, which are disentangled through domain adversarial training.
A robust classifier is trained based on the synthetic pseudo-labeled images under the supervision of source domain labels.
arXiv Detail & Related papers (2022-12-07T14:12:34Z)
- Generalized One-shot Domain Adaption of Generative Adversarial Networks [72.84435077616135]
The adaptation of a Generative Adversarial Network (GAN) aims to transfer a pre-trained GAN to a given domain with limited training data.
We consider that the adaptation from the source domain to the target domain can be decoupled into two parts: the transfer of global style, such as texture and color, and the emergence of new entities that do not belong to the source domain.
Our core objective is to constrain the gap between the internal distributions of the reference and the syntheses using the sliced Wasserstein distance.
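The sliced Wasserstein distance has a simple Monte-Carlo form: project both point sets onto random directions and compare the sorted one-dimensional projections. A minimal sketch, assuming equal-sized point sets and a hypothetical function name:

```python
import numpy as np

def sliced_wasserstein(p, q, n_proj=128, rng=None):
    # Monte-Carlo sliced Wasserstein-2 distance between two (n, d) point sets.
    rng = rng or np.random.default_rng(0)
    d = p.shape[1]
    theta = rng.normal(size=(n_proj, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # random unit directions
    # Project both sets onto each direction; in 1-D the optimal transport plan
    # simply matches sorted samples, so sorting gives the per-slice distance.
    proj_p = np.sort(p @ theta.T, axis=0)
    proj_q = np.sort(q @ theta.T, axis=0)
    return float(np.sqrt(np.mean((proj_p - proj_q) ** 2)))
```

Identical point sets give distance zero; shifting one set moves the distance roughly by the shift magnitude projected onto the slices.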
arXiv Detail & Related papers (2022-09-08T09:24:44Z)
- So Different Yet So Alike! Constrained Unsupervised Text Style Transfer [54.4773992696361]
We introduce a method for constrained unsupervised text style transfer that adds two complementary losses to the generative adversarial network (GAN) family of models.
Unlike the competing losses used in GANs, we introduce cooperative losses where the discriminator and the generator cooperate and reduce the same loss.
We show that the complementary cooperative losses improve text quality, according to both automated and human evaluation measures.
arXiv Detail & Related papers (2022-05-09T07:46:40Z)
- Contrastive Learning and Self-Training for Unsupervised Domain Adaptation in Semantic Segmentation [71.77083272602525]
Unsupervised domain adaptation (UDA) attempts to provide efficient knowledge transfer from a labeled source domain to an unlabeled target domain.
We propose a contrastive learning approach that adapts category-wise centroids across domains.
We extend our method with self-training, where we use a memory-efficient temporal ensemble to generate consistent and reliable pseudo-labels.
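The two ingredients above, category-wise centroids and a temporal ensemble for pseudo-labels, can be sketched as follows. Function names, the momentum value, and the confidence threshold are illustrative assumptions:

```python
import numpy as np

def class_centroids(features, labels, num_classes):
    # Category-wise centroid: mean feature vector of each class; aligning
    # these across source and target is the contrastive adaptation signal.
    return np.stack([features[labels == c].mean(axis=0)
                     for c in range(num_classes)])

def ema_update(ensemble_probs, current_probs, momentum=0.9):
    # Memory-efficient temporal ensemble: an exponential moving average of
    # the model's softmax predictions across training steps.
    return momentum * ensemble_probs + (1 - momentum) * current_probs

def pseudo_labels(ensemble_probs, threshold=0.8):
    # Keep only confident ensemble predictions; -1 marks samples left unlabeled.
    conf = ensemble_probs.max(axis=1)
    labels = ensemble_probs.argmax(axis=1)
    return np.where(conf >= threshold, labels, -1)
```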
arXiv Detail & Related papers (2021-05-05T11:55:53Z)
- Addressing Zero-Resource Domains Using Document-Level Context in Neural Machine Translation [80.40677540516616]
We show that when in-domain parallel data is not available, access to document-level context enables better capturing of domain generalities.
We present two document-level Transformer models which are capable of using large context sizes.
arXiv Detail & Related papers (2020-04-30T16:28:19Z)
- Dual Adversarial Domain Adaptation [6.69797982848003]
Unsupervised domain adaptation aims at transferring knowledge from the labeled source domain to the unlabeled target domain.
Recent experiments have shown that when the discriminator is provided with domain information in both domains, it is able to preserve the complex multimodal information.
We adopt a single discriminator with a $2K$-dimensional output to perform domain-level and class-level alignment simultaneously.
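A schematic reading of the $2K$-way output (an illustration, not the paper's code): logits 0..K-1 stand for source-domain classes and K..2K-1 for target-domain classes. The discriminator is trained toward the true (class, domain) slot, while the feature extractor is trained toward the same class in the opposite domain, aligning each class across domains:

```python
import numpy as np

K = 3  # hypothetical number of classes

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def disc_target(class_idx, is_source, K=K):
    # Discriminator label: slot class_idx for source samples,
    # slot class_idx + K for target samples -> joint domain+class alignment.
    t = np.zeros(2 * K)
    t[class_idx if is_source else class_idx + K] = 1.0
    return t

def feat_target(class_idx, is_source, K=K):
    # Adversarial target for the feature extractor: same class, opposite
    # domain half, so class-k features align across the two domains.
    return disc_target(class_idx, not is_source, K)

def cross_entropy(logits, target):
    return -float(np.sum(target * np.log(softmax(logits) + 1e-12)))
```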
arXiv Detail & Related papers (2020-01-01T07:10:09Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.