Domain Expansion of Image Generators
- URL: http://arxiv.org/abs/2301.05225v2
- Date: Mon, 17 Apr 2023 11:24:07 GMT
- Title: Domain Expansion of Image Generators
- Authors: Yotam Nitzan, Michaël Gharbi, Richard Zhang, Taesung Park, Jun-Yan Zhu, Daniel Cohen-Or, Eli Shechtman
- Abstract summary: We propose a new task - domain expansion - to address this.
Given a pretrained generator and novel (but related) domains, we expand the generator to jointly model all domains, old and new, harmoniously.
Using our expansion method, one "expanded" model can supersede numerous domain-specific models, without expanding the model size.
- Score: 80.8601805917418
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Can one inject new concepts into an already trained generative model, while
respecting its existing structure and knowledge? We propose a new task - domain
expansion - to address this. Given a pretrained generator and novel (but
related) domains, we expand the generator to jointly model all domains, old and
new, harmoniously. First, we note the generator contains a meaningful,
pretrained latent space. Is it possible to minimally perturb this hard-earned
representation, while maximally representing the new domains? Interestingly, we
find that the latent space offers unused, "dormant" directions, which do not
affect the output. This provides an opportunity: By "repurposing" these
directions, we can represent new domains without perturbing the original
representation. In fact, we find that pretrained generators have the capacity
to add several - even hundreds - of new domains! Using our expansion method,
one "expanded" model can supersede numerous domain-specific models, without
expanding the model size. Additionally, a single expanded generator natively
supports smooth transitions between domains, as well as composition of domains.
Code and project page available at
https://yotamnitzan.github.io/domain-expansion/.
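Below is a minimal, illustrative sketch of the "dormant direction" idea from the abstract, assuming a StyleGAN-like generator with a W latent space. The toy generator and the direction-finding step (PCA over sampled latent codes, ranked by their effect on the output) are stand-ins chosen for this sketch, not the paper's exact procedure.

```python
# Illustrative sketch only, not the authors' implementation. A toy MLP stands
# in for a pretrained StyleGAN-like generator G(w) with a W latent space.
import torch
import torch.nn as nn

torch.manual_seed(0)
W_DIM, N_SAMPLES = 512, 2048

class ToyGenerator(nn.Module):  # stand-in for a pretrained generator
    def __init__(self, w_dim=W_DIM, out_dim=3 * 16 * 16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(w_dim, 256), nn.ReLU(),
                                 nn.Linear(256, out_dim))

    def forward(self, w):
        return self.net(w)

G = ToyGenerator().eval()

# 1) Candidate latent directions, here via PCA over sampled w codes
#    (an illustrative stand-in for the paper's direction-finding step).
with torch.no_grad():
    w = torch.randn(N_SAMPLES, W_DIM)
    _, _, Vt = torch.linalg.svd(w - w.mean(0, keepdim=True), full_matrices=False)
    directions = Vt  # rows are orthonormal directions in W space

# 2) Rank directions by how much moving along them changes the output;
#    the least influential ones are treated as "dormant".
with torch.no_grad():
    base = torch.randn(64, W_DIM)
    ref = G(base)
    effect = torch.stack([(G(base + 5.0 * d) - ref).norm(dim=1).mean()
                          for d in directions])
dormant = directions[effect.argsort()[:8]]  # eight least influential directions

# 3) Repurposing (conceptual): dedicate one dormant direction per new domain.
#    Fine-tuning would push G(w + s * dormant[k]) toward domain k, while the
#    behavior on the original subspace is kept unchanged.
for k in range(len(dormant)):
    print(f"new domain {k} -> direction with output effect "
          f"{effect.sort().values[k].item():.4f}")
```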
Related papers
- DomainGallery: Few-shot Domain-driven Image Generation by Attribute-centric Finetuning [51.66633704537334]
DomainGallery is a few-shot domain-driven image generation method.
It features prior attribute erasure, attribute disentanglement, regularization and enhancement.
Experiments are given to validate the superior performance of DomainGallery on a variety of domain-driven generation scenarios.
arXiv Detail & Related papers (2024-11-07T09:55:36Z)
- UniHDA: A Unified and Versatile Framework for Multi-Modal Hybrid Domain Adaptation [22.003900281544766]
We propose UniHDA, a framework for generative hybrid domain adaptation with multi-modal references from multiple domains.
Our framework is generator-agnostic and versatile to multiple generators, e.g., StyleGAN, EG3D, and Diffusion Models.
arXiv Detail & Related papers (2024-01-23T09:49:24Z)
- Domain-Scalable Unpaired Image Translation via Latent Space Anchoring [88.7642967393508]
Unpaired image-to-image translation (UNIT) aims to map images between two visual domains without paired training data.
We propose a new domain-scalable UNIT method, termed as latent space anchoring.
Our method anchors images of different domains to the same latent space of frozen GANs by learning lightweight encoder and regressor models.
In the inference phase, the learned encoders and decoders of different domains can be arbitrarily combined to translate images between any two domains without fine-tuning.
arXiv Detail & Related papers (2023-06-26T17:50:02Z)
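Below is a hedged sketch of the anchoring idea described above: a frozen, pretrained generator defines a shared latent space, each domain gets a lightweight encoder and regressor, and at inference the pieces of different domains are combined freely. The toy module sizes and the reconstruction loss are assumptions made for this illustration, not the paper's exact architecture or objectives.

```python
# Hedged sketch of latent space anchoring with a frozen generator and
# per-domain lightweight encoders/regressors; shapes and losses are illustrative.
import torch
import torch.nn as nn

LATENT, IMG = 128, 3 * 32 * 32

G = nn.Sequential(nn.Linear(LATENT, 256), nn.ReLU(), nn.Linear(256, IMG))
for p in G.parameters():
    p.requires_grad_(False)              # the pretrained generator stays frozen

def make_encoder():                      # image -> shared latent space
    return nn.Sequential(nn.Linear(IMG, 256), nn.ReLU(), nn.Linear(256, LATENT))

def make_regressor():                    # generator output -> domain image
    return nn.Sequential(nn.Linear(IMG, 256), nn.ReLU(), nn.Linear(256, IMG))

domains = ["A", "B"]
enc = {d: make_encoder() for d in domains}
reg = {d: make_regressor() for d in domains}

# Training (per domain, sketched): anchor domain images to G's latent space by
# reconstructing x_d through the frozen G, e.g. ||reg_d(G(enc_d(x_d))) - x_d||.
x_a = torch.randn(4, IMG)
recon = reg["A"](G(enc["A"](x_a)))
loss = nn.functional.mse_loss(recon, x_a)

# Inference: encoders and regressors of *different* domains combine freely,
# so an A-domain image translates to B without any fine-tuning.
with torch.no_grad():
    x_b_from_a = reg["B"](G(enc["A"](x_a)))
print(loss.item(), x_b_from_a.shape)
```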
- Domain Re-Modulation for Few-Shot Generative Domain Adaptation [71.47730150327818]
Generative Domain Adaptation (GDA) involves transferring a pre-trained generator from one domain to a new domain using only a few reference images.
Inspired by the way human brains acquire knowledge in new domains, we present an innovative generator structure called Domain Re-Modulation (DoRM).
DoRM not only meets the criteria of high quality, large synthesis diversity, and cross-domain consistency, but also incorporates memory and domain association.
arXiv Detail & Related papers (2023-02-06T03:55:35Z)
- Open SESAME: Fighting Botnets with Seed Reconstructions of Domain Generation Algorithms [0.0]
Bots can generate pseudorandom domain names using Domain Generation Algorithms (DGAs).
A cyber criminal can register such domains to establish periodically changing rendezvous points with the bots.
We introduce SESAME, a system that combines the two above-mentioned approaches and contains a module for automatic Seed Reconstruction.
arXiv Detail & Related papers (2023-01-12T14:25:31Z)
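To make the DGA concept above concrete, here is a generic toy example (not SESAME's code, and not any real malware family): both the bots and their operator derive the same pseudorandom domain names from a shared seed and the current date, so registering one of those names creates the rendezvous point.

```python
# Toy Domain Generation Algorithm (DGA), included only to illustrate the
# concept the paper targets; the hashing scheme is an arbitrary example.
import hashlib
from datetime import date, timedelta

def dga(seed: str, day: date, count: int = 5, tld: str = ".com"):
    """Derive `count` deterministic domain names from a seed and a date."""
    domains = []
    for i in range(count):
        material = f"{seed}-{day.isoformat()}-{i}".encode()
        digest = hashlib.sha256(material).hexdigest()
        # map hex digits to letters to get a plausible-looking label
        label = "".join(chr(ord("a") + int(c, 16) % 26) for c in digest[:12])
        domains.append(label + tld)
    return domains

# Both sides compute the same list; the operator registers one of these
# domains, and infected hosts try them until one resolves.
today = date(2023, 1, 12)
print(dga("shared-or-reconstructed-seed", today))
print(dga("shared-or-reconstructed-seed", today + timedelta(days=1)))
```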
- HyperDomainNet: Universal Domain Adaptation for Generative Adversarial Networks [0.0]
We introduce a novel domain-modulation technique that optimizes only a 6-thousand-dimensional vector, instead of the 30 million weights of StyleGAN2, to adapt to a target domain.
Inspired by this reduction in the size of the optimized parameter space, we consider the problem of multi-domain adaptation of GANs.
We propose HyperDomainNet, a hypernetwork that predicts this parameterization given the target domain.
arXiv Detail & Related papers (2022-10-17T09:27:39Z)
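A hedged sketch of the domain-modulation idea follows: the pretrained generator stays frozen, adaptation is expressed as one small vector that rescales per-layer style channels, and a hypernetwork predicts that vector from a target-domain embedding in the multi-domain case. The layer widths, the (1 + d) modulation rule, and the use of a CLIP-like embedding are assumptions made for this illustration.

```python
# Hedged sketch: adapt a frozen StyleGAN-like generator via one small domain
# vector that modulates per-layer style channels; a hypernetwork predicts it.
import torch
import torch.nn as nn

STYLE_CHANNELS = [512, 512, 256, 256]           # toy per-layer channel widths
D_DIM = sum(STYLE_CHANNELS)                     # the whole adaptation lives here
EMB_DIM = 512                                   # e.g. a CLIP-like text embedding

class FrozenToyGenerator(nn.Module):
    """Stand-in for a pretrained generator whose weights stay fixed."""
    def __init__(self):
        super().__init__()
        self.layers = nn.ModuleList(nn.Linear(c, c) for c in STYLE_CHANNELS)
        for p in self.parameters():
            p.requires_grad_(False)

    def forward(self, styles, domain_vec):
        # split the single domain vector into per-layer channel-wise scales
        chunks = torch.split(domain_vec, STYLE_CHANNELS, dim=-1)
        out = []
        for layer, s, d in zip(self.layers, styles, chunks):
            out.append(layer(s * (1.0 + d)))    # modulate styles, not weights
        return out

class HyperDomainNetSketch(nn.Module):
    """Predicts the small domain vector from a target-domain embedding."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(EMB_DIM, 1024), nn.ReLU(),
                                 nn.Linear(1024, D_DIM), nn.Tanh())

    def forward(self, domain_emb):
        return self.net(domain_emb)

G = FrozenToyGenerator()
hyper = HyperDomainNetSketch()                  # only this small module is trained
styles = [torch.randn(2, c) for c in STYLE_CHANNELS]
d = hyper(torch.randn(2, EMB_DIM))
feats = G(styles, d)
print(D_DIM, [f.shape for f in feats])
```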
- Towards Diverse and Faithful One-shot Adaption of Generative Adversarial Networks [54.80435295622583]
One-shot generative domain adaptation aims to transfer a generator pre-trained on one domain to a new domain using only a single reference image.
We present DiFa, a novel one-shot generative domain adaptation method for diverse generation and faithful adaptation.
arXiv Detail & Related papers (2022-07-18T16:29:41Z)
- StyleGAN-NADA: CLIP-Guided Domain Adaptation of Image Generators [63.85888518950824]
We present a text-driven method that allows shifting a generative model to new domains.
We show that through natural language prompts and a few minutes of training, our method can adapt a generator across a multitude of domains.
arXiv Detail & Related papers (2021-08-02T14:46:46Z)
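As a final illustration, here is a hedged sketch of a CLIP-guided directional loss in the spirit of text-driven domain shifting: a trainable copy of the generator is pushed so that the change in image embedding (relative to the frozen original) aligns with the change in text embedding between a source and a target prompt. The tiny linear encoders stand in for CLIP's image and text encoders, and all dimensions, prompts, and hyperparameters are illustrative assumptions.

```python
# Hedged sketch of a CLIP-style directional adaptation loss; the stand-in
# encoders below are NOT CLIP, and the setup is illustrative only.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

LATENT, IMG, EMB = 64, 3 * 32 * 32, 128

txt_encoder = nn.Linear(16, EMB)                 # stand-in for a text encoder
img_encoder = nn.Linear(IMG, EMB)                # stand-in for an image encoder

G_frozen = nn.Sequential(nn.Linear(LATENT, 256), nn.ReLU(), nn.Linear(256, IMG))
G_train = copy.deepcopy(G_frozen)                # only this copy is optimized
for p in G_frozen.parameters():
    p.requires_grad_(False)

def embed_text(prompt_vec):
    return F.normalize(txt_encoder(prompt_vec), dim=-1)

def directional_loss(z, source_prompt, target_prompt):
    """1 - cosine similarity between the text direction and the image direction."""
    text_dir = embed_text(target_prompt) - embed_text(source_prompt)
    img_dir = (F.normalize(img_encoder(G_train(z)), dim=-1)
               - F.normalize(img_encoder(G_frozen(z)), dim=-1))
    return 1.0 - F.cosine_similarity(img_dir, text_dir.expand_as(img_dir)).mean()

opt = torch.optim.Adam(G_train.parameters(), lr=2e-3)
src, tgt = torch.randn(1, 16), torch.randn(1, 16)  # stand-ins for two prompts
for _ in range(10):                                # short training loop
    loss = directional_loss(torch.randn(8, LATENT), src, tgt)
    opt.zero_grad()
    loss.backward()
    opt.step()
print(loss.item())
```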
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.