Continuous Conditional Generative Adversarial Networks (cGAN) with
Generator Regularization
- URL: http://arxiv.org/abs/2103.14884v1
- Date: Sat, 27 Mar 2021 12:01:56 GMT
- Title: Continuous Conditional Generative Adversarial Networks (cGAN) with
Generator Regularization
- Authors: Yufeng Zheng, Yunkai Zhang, Zeyu Zheng
- Abstract summary: We propose a simple generator regularization term on the GAN generator loss in the form of a Lipschitz penalty.
We analyze the effect of the proposed regularization term and demonstrate its robust performance on a range of synthetic and real-world tasks.
- Score: 7.676408770854476
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Conditional Generative Adversarial Networks are known to be difficult to
train, especially when the conditions are continuous and high-dimensional. To
partially alleviate this difficulty, we propose a simple generator
regularization term on the GAN generator loss in the form of a Lipschitz
penalty. When the generator is fed neighboring conditions in the continuous
condition space, the regularization term leverages this neighborhood
information and pushes the generator to produce samples whose conditional
distributions are similar across neighboring conditions. We analyze the effect
of the proposed
regularization term and demonstrate its robust performance on a range of
synthetic and real-world tasks.
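For concreteness, here is a minimal PyTorch sketch of a generator regularizer in this spirit: it penalizes a finite-difference Lipschitz quotient between samples generated under a condition and under a randomly perturbed neighboring condition. This is an illustration of the idea rather than the authors' exact formulation; the network architecture, the perturbation scale `sigma`, and the penalty weight `lam` are illustrative assumptions.

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Toy conditional generator G(z, c) for a continuous condition c."""
    def __init__(self, z_dim=16, c_dim=4, x_dim=8, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(z_dim + c_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, x_dim),
        )

    def forward(self, z, c):
        return self.net(torch.cat([z, c], dim=-1))

def lipschitz_penalty(G, z, c, sigma=0.05, eps=1e-8):
    """Penalize how fast generated samples change w.r.t. the condition.

    For each condition c, draw a neighboring condition c' = c + sigma * noise
    and compute the squared finite-difference Lipschitz quotient
    ||G(z, c) - G(z, c')||^2 / ||c - c'||^2, averaged over the batch.
    """
    c_neighbor = c + sigma * torch.randn_like(c)
    x = G(z, c)
    x_neighbor = G(z, c_neighbor)
    num = (x - x_neighbor).pow(2).sum(dim=-1)
    den = (c - c_neighbor).pow(2).sum(dim=-1) + eps
    return (num / den).mean()

# Usage inside a standard cGAN generator step (adversarial loss omitted):
G = Generator()
z = torch.randn(32, 16)          # latent noise
c = torch.rand(32, 4)            # continuous conditions
lam = 0.1                        # regularization weight (illustrative)
g_loss = lam * lipschitz_penalty(G, z, c)  # + usual adversarial generator loss
g_loss.backward()
```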
Related papers
- Recursive perturbation approach to time-convolutionless master equations: Explicit construction of generalized Lindblad generators for arbitrary open systems [44.99833362998488]
We develop a perturbative expansion for the time-convolutionless (TCL) generator of an open quantum system in a generalized Lindblad form.
This formulation provides a systematic approach to derive the generator at arbitrary order while preserving a Lindblad-like structure.
To validate the method and show its effectiveness in addressing non-Markovian dynamics and strong-coupling effects, we compute the generator explicitly up to fourth order.
arXiv Detail & Related papers (2025-06-04T15:51:26Z) - Constrained Language Generation with Discrete Diffusion Models [61.81569616239755]
We present Constrained Discrete Diffusion (CDD), a novel method for enforcing constraints on natural language by integrating discrete diffusion models with differentiable optimization.
We show how this technique can be applied to satisfy a variety of natural language constraints, including (i) toxicity mitigation by preventing harmful content from emerging, (ii) character and sequence level lexical constraints, and (iii) novel molecule sequence generation with specific property adherence.
arXiv Detail & Related papers (2025-03-12T19:48:12Z) - Scalable Equilibrium Sampling with Sequential Boltzmann Generators [60.00515282300297]
We extend the Boltzmann generator framework with two key contributions.
The first is a highly efficient Transformer-based normalizing flow operating directly on all-atom Cartesian coordinates.
In particular, we perform inference-time scaling of flow samples using a continuous-time variant of sequential Monte Carlo.
arXiv Detail & Related papers (2025-02-25T18:59:13Z) - Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z) - Non-autoregressive Sequence-to-Sequence Vision-Language Models [63.77614880533488]
We propose a parallel decoding sequence-to-sequence vision-language model that marginalizes over multiple inference paths in the decoder.
The model achieves performance on-par with its state-of-the-art autoregressive counterpart, but is faster at inference time.
arXiv Detail & Related papers (2024-03-04T17:34:59Z) - Time-series Generation by Contrastive Imitation [87.51882102248395]
We study a generative framework that seeks to combine the strengths of both: Motivated by a moment-matching objective to mitigate compounding error, we optimize a local (but forward-looking) transition policy.
At inference, the learned policy serves as the generator for iterative sampling, and the learned energy serves as a trajectory-level measure for evaluating sample quality.
arXiv Detail & Related papers (2023-11-02T16:45:25Z) - Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z) - Shared Loss between Generators of GANs [7.33811357166334]
Generative adversarial networks are generative models that are capable of replicating the implicit probability distribution of the input data with high accuracy.
Traditionally, GANs consist of a Generator and a Discriminator which interact with each other to produce highly realistic artificial data.
We show that the proposed shared loss dramatically reduces GAN training time without affecting performance.
arXiv Detail & Related papers (2022-11-14T09:47:42Z) - Mode Penalty Generative Adversarial Network with adapted Auto-encoder [0.15229257192293197]
We propose a mode penalty GAN combined with a pre-trained auto-encoder to represent generated and real data samples explicitly in the encoded space.
Through experimental evaluations, we demonstrate that applying the proposed method to GANs makes generator optimization more stable and convergence faster.
arXiv Detail & Related papers (2020-11-16T03:39:53Z) - GANs with Variational Entropy Regularizers: Applications in Mitigating
the Mode-Collapse Issue [95.23775347605923]
Building on the success of deep learning, Generative Adversarial Networks (GANs) provide a modern approach to learn a probability distribution from observed samples.
GANs often suffer from the mode collapse issue where the generator fails to capture all existing modes of the input distribution.
We take an information-theoretic approach and maximize a variational lower bound on the entropy of the generated samples to increase their diversity.
arXiv Detail & Related papers (2020-09-24T19:34:37Z) - Conditional Hybrid GAN for Sequence Generation [56.67961004064029]
We propose a novel conditional hybrid GAN (C-Hybrid-GAN) for generating discrete-valued sequences conditioned on context.
We exploit the Gumbel-Softmax technique to approximate the distribution of discrete-valued sequences (see the sketch after this list).
We demonstrate that the proposed C-Hybrid-GAN outperforms the existing methods in context-conditioned discrete-valued sequence generation.
arXiv Detail & Related papers (2020-09-18T03:52:55Z) - Learning disconnected manifolds: a no GANs land [15.4867805276559]
Generative Adversarial Networks make use of a unimodal latent distribution transformed by a continuous generator.
We establish a no-free-lunch theorem for disconnected manifold learning, stating an upper bound on the precision of the targeted distribution.
We derive a rejection sampling method based on the norm of the generator's Jacobian and show its efficiency on several generators, including BigGAN.
arXiv Detail & Related papers (2020-06-08T13:45:22Z) - Smoothness and Stability in GANs [21.01604897837572]
Generative adversarial networks, or GANs, commonly display unstable behavior during training.
We develop a principled theoretical framework for understanding the stability of various types of GANs.
arXiv Detail & Related papers (2020-02-11T03:08:28Z)
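The C-Hybrid-GAN entry above mentions the Gumbel-Softmax technique for approximating the distribution of discrete-valued sequences. Below is a minimal, generic PyTorch sketch of that trick, not code from the paper: Gumbel noise is added to the logits and a temperature-scaled softmax yields a differentiable, approximately one-hot sample. The temperature `tau` and the straight-through option are common choices, and PyTorch also ships an equivalent built-in, `torch.nn.functional.gumbel_softmax`.

```python
import torch
import torch.nn.functional as F

def gumbel_softmax_sample(logits, tau=1.0, hard=False):
    """Draw a differentiable, approximately one-hot sample from a categorical
    distribution parameterized by `logits`, via the Gumbel-Softmax trick."""
    # Sample Gumbel(0, 1) noise and perturb the logits.
    gumbel = -torch.log(-torch.log(torch.rand_like(logits) + 1e-20) + 1e-20)
    y = F.softmax((logits + gumbel) / tau, dim=-1)
    if hard:
        # Straight-through estimator: forward pass uses the one-hot argmax,
        # backward pass uses the gradient of the soft sample.
        index = y.argmax(dim=-1, keepdim=True)
        y_hard = torch.zeros_like(y).scatter_(-1, index, 1.0)
        y = (y_hard - y).detach() + y
    return y

logits = torch.randn(4, 10, requires_grad=True)   # batch of 4, vocabulary of 10
sample = gumbel_softmax_sample(logits, tau=0.5, hard=True)
sample.sum().backward()                            # gradients flow back to logits
```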
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.