Real or Not Real, that is the Question
- URL: http://arxiv.org/abs/2002.05512v1
- Date: Wed, 12 Feb 2020 18:41:55 GMT
- Title: Real or Not Real, that is the Question
- Authors: Yuanbo Xiangli, Yubin Deng, Bo Dai, Chen Change Loy, Dahua Lin
- Abstract summary: We generalize the standard generative adversarial networks (GAN) to a new perspective by treating realness as a random variable.
In this framework, referred to as RealnessGAN, the discriminator outputs a distribution as the measure of realness.
It enables the basic DCGAN architecture to generate realistic images at 1024×1024 resolution when trained from scratch.
- Score: 165.82386565136107
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While generative adversarial networks (GAN) have been widely adopted in
various topics, in this paper we generalize the standard GAN to a new
perspective by treating realness as a random variable that can be estimated
from multiple angles. In this generalized framework, referred to as
RealnessGAN, the discriminator outputs a distribution as the measure of
realness. While RealnessGAN shares similar theoretical guarantees with the
standard GAN, it provides more insights on adversarial learning. Compared to
multiple baselines, RealnessGAN provides stronger guidance for the generator,
achieving improvements on both synthetic and real-world datasets. Moreover, it
enables the basic DCGAN architecture to generate realistic images at 1024×1024
resolution when trained from scratch.
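The abstract describes a discriminator that outputs a distribution over realness rather than a single scalar score. Below is a minimal sketch of that idea, assuming PyTorch, a toy MLP discriminator, 10 discrete realness outcomes, and hand-picked anchor distributions for "real" and "fake"; the anchor shapes and the generator objective shown are illustrative assumptions, not the paper's exact recipe.

```python
# Minimal sketch of a "realness distribution" discriminator (illustrative, not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F

N_OUTCOMES = 10  # number of discrete realness outcomes (an assumption for this sketch)

class RealnessDiscriminator(nn.Module):
    """Discriminator that outputs a distribution over realness outcomes
    instead of a single real/fake score."""
    def __init__(self, in_dim=64 * 64 * 3, n_outcomes=N_OUTCOMES):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 512), nn.LeakyReLU(0.2),
            nn.Linear(512, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, n_outcomes),  # logits over realness outcomes
        )

    def forward(self, x):
        logits = self.net(x.flatten(1))
        return F.log_softmax(logits, dim=1)  # log-probabilities of the realness distribution

# Anchor distributions: one skewed toward "real" outcomes, one toward "fake" outcomes.
# Their exact shapes are a design choice; these are hand-picked for illustration.
anchor_real = F.softmax(torch.linspace(-2.0, 2.0, N_OUTCOMES), dim=0)
anchor_fake = F.softmax(torch.linspace(2.0, -2.0, N_OUTCOMES), dim=0)

def kl_to_anchor(anchor, log_probs):
    # KL(anchor || D(x)), averaged over the batch.
    return (anchor * (anchor.log() - log_probs)).sum(dim=1).mean()

def d_loss(d, real_imgs, fake_imgs):
    # Discriminator pulls D(real) toward the real anchor and D(fake) toward the fake anchor.
    return kl_to_anchor(anchor_real, d(real_imgs)) + kl_to_anchor(anchor_fake, d(fake_imgs.detach()))

def g_loss(d, fake_imgs):
    # One simple generator objective: push D(fake) toward the real anchor and away from
    # the fake anchor (the paper discusses several generator objectives; this is one variant).
    return kl_to_anchor(anchor_real, d(fake_imgs)) - kl_to_anchor(anchor_fake, d(fake_imgs))
```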
Related papers
- Generalized Deepfake Attribution [1.4999444543328293]
Generative Adversarial Networks (GANs) can yield countless model variations depending on the seed used during training.
Existing methods for attributing deepfakes work well only if they have seen the specific GAN model during training.
We propose a generalized deepfake attribution network (GDA-Net) to attribute fake images to their respective GAN architectures.
arXiv Detail & Related papers (2024-06-26T12:04:09Z) - Towards Realistic Data Generation for Real-World Super-Resolution [58.88039242455039]
RealDGen is an unsupervised learning data generation framework designed for real-world super-resolution.
We develop content and degradation extraction strategies, which are integrated into a novel content-degradation decoupled diffusion model.
Experiments demonstrate that RealDGen excels in generating large-scale, high-quality paired data that mirrors real-world degradations.
arXiv Detail & Related papers (2024-06-11T13:34:57Z) - Distilling Representations from GAN Generator via Squeeze and Span [55.76208869775715]
We propose to distill knowledge from GAN generators by squeezing and spanning their representations.
We then span the distilled representation from the synthetic domain to the real domain, also using real training data to remedy the mode collapse of GANs.
arXiv Detail & Related papers (2022-11-06T01:10:28Z) - Details or Artifacts: A Locally Discriminative Learning Approach to Realistic Image Super-Resolution [28.00231586840797]
Single image super-resolution (SISR) with generative adversarial networks (GAN) has recently attracted increasing attention due to its potential to generate rich details.
In this paper, we demonstrate that it is possible to train a GAN-based SISR model which can stably generate perceptually realistic details while inhibiting visual artifacts.
arXiv Detail & Related papers (2022-03-17T09:35:50Z) - Forward Super-Resolution: How Can GANs Learn Hierarchical Generative Models for Real-World Distributions [66.05472746340142]
Generative adversarial networks (GANs) are among the most successful models for learning high-complexity, real-world distributions.
In this paper we show how GANs can efficiently learn the distribution of real-life images.
arXiv Detail & Related papers (2021-06-04T17:33:29Z) - Improving Generative Adversarial Networks with Local Coordinate Coding [150.24880482480455]
Generative adversarial networks (GANs) have shown remarkable success in generating realistic data from some predefined prior distribution.
In practice, semantic information might be represented by some latent distribution learned from data.
We propose an LCCGAN model with local coordinate coding (LCC) to improve the performance of generating data.
arXiv Detail & Related papers (2020-07-28T09:17:50Z) - Robust Generative Adversarial Network [37.015223009069175]
We aim to improve the generalization capability of GANs by promoting local robustness within a small neighborhood of the training samples.
We design a robust optimization framework where the generator and discriminator compete with each other in a worst-case setting within a small Wasserstein ball.
We have proved that our robust method can obtain a tighter generalization upper bound than traditional GANs under mild assumptions.
arXiv Detail & Related papers (2020-04-28T07:37:01Z) - GANs with Conditional Independence Graphs: On Subadditivity of Probability Divergences [70.30467057209405]
Generative Adversarial Networks (GANs) are modern methods to learn the underlying distribution of a data set.
GANs are designed in a model-free fashion where no additional information about the underlying distribution is available.
We propose a principled design of a model-based GAN that uses a set of simple discriminators on the neighborhoods of the Bayes-net/MRF.
arXiv Detail & Related papers (2020-03-02T04:31:22Z)