Data Interpolants -- That's What Discriminators in Higher-order
Gradient-regularized GANs Are
- URL: http://arxiv.org/abs/2306.00785v1
- Date: Thu, 1 Jun 2023 15:16:36 GMT
- Title: Data Interpolants -- That's What Discriminators in Higher-order
Gradient-regularized GANs Are
- Authors: Siddarth Asokan and Chandra Sekhar Seelamantula
- Abstract summary: We show analytically, via the least-squares (LSGAN) and Wasserstein (WGAN) GAN variants, that the discriminator optimization problem is one of interpolation in $n$ dimensions.
The optimal discriminator, derived using variational calculus, turns out to be the solution to a partial differential equation involving the iterated Laplacian or the polyharmonic operator.
We employ the Poly-WGAN discriminator to model the latent space distribution of the data with encoder-decoder-based GAN flavors such as Wasserstein autoencoders.
- Score: 20.03447539784024
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider the problem of optimizing the discriminator in generative
adversarial networks (GANs) subject to higher-order gradient regularization. We
show analytically, via the least-squares (LSGAN) and Wasserstein (WGAN) GAN
variants, that the discriminator optimization problem is one of interpolation
in $n$-dimensions. The optimal discriminator, derived using variational
calculus, turns out to be the solution to a partial differential equation
involving the iterated Laplacian or the polyharmonic operator. The solution is
implementable in closed-form via polyharmonic radial basis function (RBF)
interpolation. In view of the polyharmonic connection, we refer to the
corresponding GANs as Poly-LSGAN and Poly-WGAN. Through experimental validation
on multivariate Gaussians, we show that implementing the optimal RBF
discriminator in closed-form, with penalty orders $m \approx\lceil \frac{n}{2}
\rceil$, results in superior performance compared to training GANs with
arbitrarily chosen discriminator architectures. We employ the Poly-WGAN
discriminator to model the latent space distribution of the data with
encoder-decoder-based GAN flavors such as Wasserstein autoencoders.
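As a rough illustration of the closed-form RBF discriminator described above (a minimal sketch; the sample sets, the ±1 interpolation targets, and the thin-plate kernel are illustrative assumptions, not the paper's exact Poly-LSGAN/Poly-WGAN construction), one can fit a polyharmonic RBF interpolant over real and generated samples and evaluate it directly, with no discriminator training loop:

```python
# Minimal sketch: a closed-form polyharmonic-RBF "discriminator" obtained by
# interpolation instead of training. Targets are +1 on (stand-in) real samples
# and -1 on (stand-in) generated samples; the kernel is the thin-plate spline,
# a classical polyharmonic kernel (illustrative choice, not the paper's tuned
# penalty order m ~ ceil(n/2)).
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
n = 2                                              # data dimension
x_real = rng.normal(loc=+2.0, size=(200, n))       # stand-in "data" samples
x_fake = rng.normal(loc=-2.0, size=(200, n))       # stand-in "generator" samples

centers = np.vstack([x_real, x_fake])
targets = np.concatenate([np.ones(len(x_real)), -np.ones(len(x_fake))])

# Solving the RBF linear system is the only "optimization"; the result is closed-form.
disc = RBFInterpolator(centers, targets, kernel="thin_plate_spline", smoothing=1e-3)

query = rng.normal(size=(5, n))
print(disc(query))    # positive values lean "real", negative lean "generated"
```

The only computation is a linear solve for the RBF weights, which is what makes this kind of discriminator implementable in closed form.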
Related papers
- Closed-form Filtering for Non-linear Systems [83.91296397912218]
We propose a new class of filters based on Gaussian PSD Models, which offer several advantages in terms of density approximation and computational efficiency.
We show that filtering can be efficiently performed in closed form when transitions and observations are Gaussian PSD Models.
Our proposed estimator enjoys strong theoretical guarantees, with estimation error that depends on the quality of the approximation and is adaptive to the regularity of the transition probabilities.
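The Gaussian PSD model machinery does not fit in a short snippet, but the classical linear-Gaussian special case below (a standard Kalman filter, included only as background and not as the paper's method) shows what "closed-form filtering" means: prediction and update both remain Gaussian and are computed exactly with matrix algebra.

```python
# Background sketch: closed-form filtering in the classical linear-Gaussian case
# (Kalman filter). The paper's Gaussian PSD models generalize this closed-form
# behavior to a richer model class; nothing below is specific to the paper.
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (constant-velocity model)
H = np.array([[1.0, 0.0]])               # observation matrix (position only)
Q = 0.01 * np.eye(2)                     # process noise covariance
R = np.array([[0.5]])                    # observation noise covariance

def kalman_step(mean, cov, y):
    # Prediction: push the Gaussian belief through the linear dynamics.
    mean_pred, cov_pred = A @ mean, A @ cov @ A.T + Q
    # Update: condition on observation y; the posterior is again Gaussian.
    S = H @ cov_pred @ H.T + R
    K = cov_pred @ H.T @ np.linalg.inv(S)
    mean_new = mean_pred + K @ (y - H @ mean_pred)
    cov_new = (np.eye(2) - K @ H) @ cov_pred
    return mean_new, cov_new

mean, cov = np.zeros(2), np.eye(2)
for y in [np.array([0.9]), np.array([2.1]), np.array([2.9])]:
    mean, cov = kalman_step(mean, cov, y)
print(mean)   # filtered state estimate after three observations
```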
arXiv Detail & Related papers (2024-02-15T08:51:49Z) - On the Computation of the Gaussian Rate-Distortion-Perception Function [10.564071872770146]
We study the computation of the rate-distortion-perception function (RDPF) for a multivariate Gaussian source under mean squared error (MSE) distortion.
We provide the associated algorithmic realization, as well as the convergence and the rate of convergence characterization.
We corroborate our results with numerical simulations and draw connections to existing results.
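For context, the classical rate-distortion function of a scalar Gaussian source with variance $\sigma^2$ under MSE distortion (a standard fact, not a result of this paper; the RDPF adds a perception constraint on top of this trade-off) is

$$ R(D) \;=\; \max\left\{0,\; \tfrac{1}{2}\log_2\frac{\sigma^2}{D}\right\}. $$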
arXiv Detail & Related papers (2023-11-15T18:34:03Z) - Stable Nonconvex-Nonconcave Training via Linear Interpolation [51.668052890249726]
This paper presents a theoretical analysis of linear interpolation as a principled method for stabilizing (large-scale) neural network training.
We argue that instabilities in the optimization process are often caused by the nonmonotonicity of the loss landscape and show how linear interpolation can help by leveraging the theory of nonexpansive operators.
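A toy illustration of the interpolation idea (not the paper's algorithm or its analysis; the bilinear game, step sizes, and interpolation weight below are assumptions chosen for the demo): plain gradient descent-ascent on $f(x, y) = xy$ spirals away from the saddle point, while linearly interpolating the iterate toward the result of a few inner descent-ascent steps contracts toward it.

```python
# Toy bilinear game f(x, y) = x * y. Simultaneous gradient descent-ascent (GDA)
# diverges from the saddle point (0, 0); linearly interpolating toward the
# result of k inner GDA steps (a lookahead-style averaging step) converges.
import numpy as np

eta, k, lam = 0.1, 5, 0.5     # inner step size, inner steps, interpolation weight

def gda_steps(w, steps):
    x, y = w
    for _ in range(steps):
        x, y = x - eta * y, y + eta * x   # descend in x, ascend in y on f = x*y
    return np.array([x, y])

w_plain, w_avg = np.array([1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(200):
    w_plain = gda_steps(w_plain, 1)                         # plain GDA
    w_avg = (1 - lam) * w_avg + lam * gda_steps(w_avg, k)   # interpolated update

print("plain GDA distance from saddle:", np.linalg.norm(w_plain))    # grows
print("interpolated update distance:  ", np.linalg.norm(w_avg))      # shrinks
```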
arXiv Detail & Related papers (2023-10-20T12:45:12Z) - GANs Settle Scores! [16.317645727944466]
We propose a unified variational approach to analyzing generator optimization.
In $f$-divergence-minimizing GANs, we show that the optimal generator is the one that matches the score of its output distribution with that of the data distribution.
We propose novel alternatives to $f$-GAN and IPM-GAN training based on score and flow matching, and discriminator-guided Langevin sampling.
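Read schematically (my own formalization of the summary, not notation taken from the paper), the optimality claim and the associated score-matching-style objective are

$$ \nabla_x \log p_{g^\star}(x) = \nabla_x \log p_d(x), \qquad \mathcal{L}(g) = \mathbb{E}_{x \sim p_g}\!\left[\big\|\nabla_x \log p_g(x) - \nabla_x \log p_d(x)\big\|^2\right], $$

where $p_g$ is the generator's output density and $p_d$ the data density.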
arXiv Detail & Related papers (2023-06-02T16:24:07Z) - Combating Mode Collapse in GANs via Manifold Entropy Estimation [70.06639443446545]
Generative Adversarial Networks (GANs) have shown compelling results in various tasks and applications.
We propose a novel training pipeline to address the mode collapse issue of GANs.
arXiv Detail & Related papers (2022-08-25T12:33:31Z) - Numerical Solution of Stiff Ordinary Differential Equations with Random
Projection Neural Networks [0.0]
We propose a numerical scheme based on Random Projection Neural Networks (RPNN) for the solution of Ordinary Differential Equations (ODEs).
We show that our proposed scheme yields good numerical approximation accuracy without being affected by the stiffness, in some cases outperforming the ode45 and ode15s MATLAB functions.
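A minimal sketch of the random-projection idea (not the paper's scheme or its stiff test suite; the test ODE, feature count, and weighting below are illustrative assumptions): with random, fixed hidden weights, enforcing the ODE at collocation points is linear in the output weights, so the solver reduces to a least-squares solve.

```python
# Random-projection collocation sketch for the linear test ODE
#   y'(t) = -5 y(t),  y(0) = 1,   exact solution  y(t) = exp(-5 t).
# Hidden weights are random and frozen; only the linear output weights are fit,
# so the collocation conditions form a least-squares problem.
import numpy as np

rng = np.random.default_rng(0)
M = 100                                    # number of random features
a = rng.uniform(-10, 10, size=M)           # random input weights (frozen)
b = rng.uniform(-10, 10, size=M)           # random biases (frozen)

phi = lambda t: np.tanh(np.outer(t, a) + b)              # features, shape (T, M)
dphi = lambda t: a / np.cosh(np.outer(t, a) + b) ** 2    # time derivative of features

lam = 5.0
t_col = np.linspace(0.0, 1.0, 60)          # collocation points

# Residual rows: y'(t_i) + lam * y(t_i) = 0 with y(t) = phi(t) @ w.
A_res, rhs_res = dphi(t_col) + lam * phi(t_col), np.zeros(len(t_col))
# Initial-condition row: y(0) = 1, weighted more heavily.
A_ic, rhs_ic = 50.0 * phi(np.array([0.0])), np.array([50.0])

w, *_ = np.linalg.lstsq(np.vstack([A_res, A_ic]),
                        np.concatenate([rhs_res, rhs_ic]), rcond=None)

t_test = np.linspace(0.0, 1.0, 5)
print(phi(t_test) @ w)         # random-projection approximation
print(np.exp(-lam * t_test))   # exact solution, for comparison
```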
arXiv Detail & Related papers (2021-08-03T15:49:17Z) - Hidden Convexity of Wasserstein GANs: Interpretable Generative Models
with Closed-Form Solutions [31.952858521063277]
We analyze the training of Wasserstein GANs with two-layer neural network discriminators through the lens of convex duality.
We further demonstrate the effect of different discriminator activation functions.
arXiv Detail & Related papers (2021-07-12T18:33:49Z) - Probabilistic Circuits for Variational Inference in Discrete Graphical
Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the evidence lower bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
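For reference, the evidence lower bound for a target model $p(x, z)$ and variational distribution $q(z)$ (standard definition, not specific to this paper; here $q$ is realized by a selective SPN) is

$$ \log p(x) \;\ge\; \mathrm{ELBO}(q) \;=\; \mathbb{E}_{q(z)}\!\left[\log p(x, z)\right] - \mathbb{E}_{q(z)}\!\left[\log q(z)\right]. $$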
arXiv Detail & Related papers (2020-10-22T05:04:38Z) - Discriminator Contrastive Divergence: Semi-Amortized Generative Modeling
by Exploring Energy of the Discriminator [85.68825725223873]
Generative Adversarial Networks (GANs) have shown great promise in modeling high dimensional data.
We introduce Discriminator Contrastive Divergence, which is well motivated by the properties of the WGAN discriminator.
We demonstrate significantly improved generation on both synthetic data and several real-world image generation benchmarks.
arXiv Detail & Related papers (2020-04-05T01:50:16Z) - Your GAN is Secretly an Energy-based Model and You Should use
Discriminator Driven Latent Sampling [106.68533003806276]
We show that improved samples can be obtained by sampling in latent space according to an energy-based model induced by the sum of the latent prior log-density and the discriminator output score.
We show that Discriminator Driven Latent Sampling (DDLS) is highly efficient compared to previous methods which work in the high-dimensional pixel space.
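A minimal sketch of the latent-space sampling idea described above (the generator, discriminator, step size, and chain length are toy stand-ins, not the paper's models or hyperparameters): define the latent energy as minus the sum of the prior log-density and the discriminator score, and run unadjusted Langevin dynamics on it.

```python
# Sketch of discriminator-driven latent sampling: Langevin dynamics in latent
# space on the energy  E(z) = -[ log p(z) + D(G(z)) ],  i.e. the latent prior
# log-density plus the discriminator output score. G and D are toy stand-ins.
import torch

def G(z):                      # toy "generator": fixed linear map to sample space
    return z @ torch.tensor([[1.0, 0.5], [-0.5, 1.0]])

def D(x):                      # toy "discriminator" score: peaks near x = (2, 2)
    return -0.5 * ((x - 2.0) ** 2).sum(dim=-1)

def latent_energy(z):
    log_prior = -0.5 * (z ** 2).sum(dim=-1)   # standard-normal prior, up to a constant
    return -(log_prior + D(G(z)))

def ddls_sample(n_steps=200, step=1e-2, n_chains=64):
    z = torch.randn(n_chains, 2)
    for _ in range(n_steps):
        z = z.detach().requires_grad_(True)
        (grad,) = torch.autograd.grad(latent_energy(z).sum(), z)
        # Unadjusted Langevin update: drift down the energy, plus Gaussian noise.
        z = z - 0.5 * step * grad + (step ** 0.5) * torch.randn_like(z)
    return G(z.detach())

print(ddls_sample().mean(dim=0))   # samples concentrate where D(G(z)) is high
```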
arXiv Detail & Related papers (2020-03-12T23:33:50Z)