Towards Diverse and Faithful One-shot Adaption of Generative Adversarial
Networks
- URL: http://arxiv.org/abs/2207.08736v1
- Date: Mon, 18 Jul 2022 16:29:41 GMT
- Title: Towards Diverse and Faithful One-shot Adaption of Generative Adversarial
Networks
- Authors: Yabo Zhang, Mingshuai Yao, Yuxiang Wei, Zhilong Ji, Jinfeng Bai,
Wangmeng Zuo
- Abstract summary: One-shot generative domain adaptation aims to transfer a generator pre-trained on one domain to a new domain using only one reference image.
We present DiFa, a novel one-shot generative domain adaptation method for diverse generation and faithful adaptation.
- Score: 54.80435295622583
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: One-shot generative domain adaptation aims to transfer a generator
pre-trained on one domain to a new domain using only one reference image.
However, it remains very challenging for the adapted generator to (i) generate
diverse images that inherit the diversity of the pre-trained generator while
(ii) faithfully acquiring the domain-specific attributes and style of the
reference image. In this paper, we present DiFa, a novel one-shot generative
domain adaptation method for diverse generation and faithful adaptation. For
global-level adaptation, we leverage the difference between the CLIP embedding
of the reference image and the mean CLIP embedding of source images to
constrain the target generator. For local-level adaptation, we introduce an
attentive style loss that aligns each intermediate token of an adapted image
with the corresponding token of the reference image. To facilitate diverse
generation, we introduce selective cross-domain consistency, which selects and
retains the domain-sharing attributes in the editing latent $\mathcal{W}+$
space so that the adapted generator inherits the diversity of the pre-trained
one. Extensive experiments show that our method outperforms state-of-the-art
methods both quantitatively and qualitatively, especially in cases with large
domain gaps. Moreover, DiFa extends easily to zero-shot generative domain
adaptation with appealing results. Code is available at
https://github.com/1170300521/DiFa.
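
The global-level constraint lends itself to a short illustration. Below is a
minimal PyTorch sketch of a CLIP-direction loss of the kind the abstract
describes, where the target direction is the reference image's CLIP embedding
minus the mean embedding of source images. The function names and the use of
OpenAI's clip package are assumptions made for illustration, not DiFa's actual
implementation (see the linked repository for that).

```python
# Hypothetical sketch of a global-level CLIP-direction loss; not DiFa's code.
import torch
import torch.nn.functional as F
import clip  # pip install git+https://github.com/openai/CLIP.git

device = "cuda" if torch.cuda.is_available() else "cpu"
clip_model, clip_preprocess = clip.load("ViT-B/32", device=device)

def embed(images: torch.Tensor) -> torch.Tensor:
    """CLIP-embed a batch of images already preprocessed to CLIP's input size."""
    feats = clip_model.encode_image(images)
    return F.normalize(feats, dim=-1)

def global_direction_loss(adapted: torch.Tensor,
                          source: torch.Tensor,
                          reference: torch.Tensor) -> torch.Tensor:
    """adapted[i] is the target-generator output for the same latent as
    source[i]; reference is a single-image batch of shape (1, 3, H, W)."""
    # Domain direction: reference embedding minus the mean source embedding.
    target_dir = F.normalize(
        embed(reference) - embed(source).mean(dim=0, keepdim=True), dim=-1)
    # Per-sample edit direction: adapted embedding minus its source embedding.
    sample_dir = F.normalize(embed(adapted) - embed(source), dim=-1)
    # Push every edit direction to be cosine-aligned with the domain direction.
    return (1.0 - (sample_dir * target_dir).sum(dim=-1)).mean()
```

A cosine form like this mirrors the directional CLIP loss of StyleGAN-NADA
(listed under related papers below), with its text-derived direction replaced
by an image-derived one.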
Related papers
- UniHDA: A Unified and Versatile Framework for Multi-Modal Hybrid Domain Adaptation [22.003900281544766] (arXiv, 2024-01-23)
We propose UniHDA, a framework for generative hybrid domain adaptation with multi-modal references from multiple domains.
The framework is generator-agnostic and applies to multiple generators, e.g., StyleGAN, EG3D, and Diffusion Models.
- Few-shot Hybrid Domain Adaptation of Image Generators [14.779903669510846] (arXiv, 2023-10-30)
Few-shot Hybrid Domain Adaptation aims to acquire an adapted generator that preserves the integrated attributes of all target domains.
We introduce a discriminator-free framework that directly encodes images from different domains into well-separable subspaces.
Experiments show that our method can obtain numerous domain-specific attributes in a single adapted generator.
- Zero-shot Generative Model Adaptation via Image-specific Prompt Learning [41.344908073632986] (arXiv, 2023-04-06)
CLIP-guided image synthesis has shown appealing performance in adapting a pre-trained source-domain generator to an unseen target domain.
We propose an Image-specific Prompt Learning (IPL) method, which learns a specific prompt vector for each source-domain image.
IPL effectively improves the quality and diversity of synthesized images and alleviates mode collapse.
- Generalized One-shot Domain Adaption of Generative Adversarial Networks [72.84435077616135] (arXiv, 2022-09-08)
GAN adaptation aims to transfer a pre-trained GAN to a given domain with limited training data.
The adaptation from the source domain to the target domain can be decoupled into two parts: the transfer of global style, such as texture and color, and the emergence of new entities that do not belong to the source domain.
The core objective is to constrain the gap between the internal distributions of the reference and the syntheses via the sliced Wasserstein distance (a generic sketch of this distance follows the list).
- One-Shot Generative Domain Adaptation [39.17324951275831] (arXiv, 2021-11-18)
This work transfers a Generative Adversarial Network (GAN) pre-trained on one image domain to a new domain using as few as one target image.
- StyleGAN-NADA: CLIP-Guided Domain Adaptation of Image Generators [63.85888518950824] (arXiv, 2021-08-02)
We present a text-driven method that allows shifting a generative model to new domains.
Through natural language prompts and a few minutes of training, our method can adapt a generator across a multitude of domains.
- Few-shot Image Generation via Cross-domain Correspondence [98.2263458153041] (arXiv, 2021-04-13)
Training generative models, such as GANs, on a target domain containing limited examples can easily result in overfitting.
We use a large source domain for pretraining and transfer the diversity information from source to target.
To further reduce overfitting, we present an anchor-based strategy that encourages different levels of realism over different regions of the latent space.
- TriGAN: Image-to-Image Translation for Multi-Source Domain Adaptation [82.52514546441247] (arXiv, 2020-04-19)
We propose the first approach to Multi-Source Domain Adaptation (MSDA) based on Generative Adversarial Networks.
Our method is inspired by the observation that the appearance of a given image depends on three factors: the domain, the style, and the content.
We test our approach on common MSDA benchmarks, showing that it outperforms state-of-the-art methods.
- CrDoCo: Pixel-level Domain Transfer with Cross-Domain Consistency [119.45667331836583] (arXiv, 2020-01-09)
Unsupervised domain adaptation algorithms aim to transfer knowledge learned in one domain to another.
We present a novel pixel-wise adversarial domain adaptation algorithm.
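
The "Generalized One-shot Domain Adaption" entry above names the sliced
Wasserstein distance as its core objective. As a generic reference, here is
the standard Monte Carlo estimator of that distance, assuming two equal-size
feature sets; it is a textbook sketch, not that paper's implementation.

```python
# Textbook Monte Carlo estimator of the sliced Wasserstein distance (SWD-1).
import torch

def sliced_wasserstein(x: torch.Tensor, y: torch.Tensor,
                       n_proj: int = 128) -> torch.Tensor:
    """SWD-1 estimate between equal-size feature sets x, y of shape (n, d)."""
    proj = torch.randn(x.shape[1], n_proj, device=x.device)
    proj = proj / proj.norm(dim=0, keepdim=True)   # random unit directions
    x_proj = (x @ proj).sort(dim=0).values         # 1-D optimal transport reduces
    y_proj = (y @ proj).sort(dim=0).values         # to sorting the projections
    return (x_proj - y_proj).abs().mean()          # average transport cost
```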
This list is automatically generated from the titles and abstracts of the papers on this site.