Optimal 1-Wasserstein Distance for WGANs
- URL: http://arxiv.org/abs/2201.02824v2
- Date: Thu, 5 Oct 2023 15:51:11 GMT
- Title: Optimal 1-Wasserstein Distance for WGANs
- Authors: Arthur Stéphanovitch, Ugo Tanielian, Benoît Cadre, Nicolas
Klutchnikoff, Gérard Biau
- Abstract summary: We provide a thorough analysis of Wasserstein GANs (WGANs) in both the finite sample and asymptotic regimes.
We derive in passing new results on optimal transport theory in the semi-discrete setting.
- Score: 2.1174215880331775
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The mathematical forces at work behind Generative Adversarial Networks raise
challenging theoretical issues. Motivated by the important question of
characterizing the geometrical properties of the generated distributions, we
provide a thorough analysis of Wasserstein GANs (WGANs) in both the finite
sample and asymptotic regimes. We study the specific case where the latent
space is univariate and derive results valid regardless of the dimension of the
output space. We show in particular that for a fixed sample size, the optimal
WGANs are closely linked with connected paths minimizing the sum of the squared
Euclidean distances between the sample points. We also highlight the fact that
WGANs are able to approach (for the 1-Wasserstein distance) the target
distribution as the sample size tends to infinity, at a given convergence rate
and provided the family of generative Lipschitz functions grows appropriately.
We derive in passing new results on optimal transport theory in the
semi-discrete setting.
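The path characterization above can be illustrated with a toy sketch: for a fixed small sample, brute-force search over orderings finds the connected path minimizing the sum of squared Euclidean distances between sample points. This is an illustrative assumption-laden demo (exhaustive search is only feasible for tiny samples), not the paper's construction:

```python
# Toy sketch: find the ordering (path) of sample points that minimizes
# the sum of squared Euclidean distances between consecutive points.
# Brute force over all permutations -- only feasible for tiny n.
from itertools import permutations

def path_cost(points, order):
    """Sum of squared Euclidean distances along the path `order`."""
    return sum(
        sum((a - b) ** 2 for a, b in zip(points[i], points[j]))
        for i, j in zip(order, order[1:])
    )

def optimal_path(points):
    """Exhaustively search all orderings; return (best_order, best_cost)."""
    best = min(permutations(range(len(points))),
               key=lambda o: path_cost(points, o))
    return best, path_cost(points, best)

# Four planar sample points (hypothetical data for illustration).
points = [(0.0, 0.0), (1.0, 0.0), (3.0, 0.0), (1.5, 1.0)]
order, cost = optimal_path(points)  # visiting order 0 -> 1 -> 3 -> 2
```

For these four points the minimal-cost path detours through the off-axis point rather than jumping the long gap, since squaring penalizes long edges heavily.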
Related papers
- A Wasserstein perspective of Vanilla GANs [0.0]
Vanilla GANs are generalizations of Wasserstein GANs.
In particular, we obtain an oracle inequality for Vanilla GANs in Wasserstein distance.
We conclude a rate of convergence for Vanilla GANs as well as Wasserstein GANs as estimators of the unknown probability distribution.
arXiv Detail & Related papers (2024-03-22T16:04:26Z) - Sample Complexity for Quadratic Bandits: Hessian Dependent Bounds and
Optimal Algorithms [64.10576998630981]
We show the first tight characterization of the optimal Hessian-dependent sample complexity.
A Hessian-independent algorithm universally achieves the optimal sample complexities for all Hessian instances.
The optimal sample complexities achieved by our algorithm remain valid for heavy-tailed noise distributions.
arXiv Detail & Related papers (2023-06-21T17:03:22Z) - PAC-Bayesian Generalization Bounds for Adversarial Generative Models [2.828173677501078]
We develop generalization bounds for models based on the Wasserstein distance and the total variation distance.
Our results naturally apply to Wasserstein GANs and Energy-Based GANs, and our bounds provide new training objectives for these two.
arXiv Detail & Related papers (2023-02-17T15:25:49Z) - Robust Estimation for Nonparametric Families via Generative Adversarial
Networks [92.64483100338724]
We provide a framework for designing Generative Adversarial Networks (GANs) to solve high dimensional robust statistics problems.
Our work extends these to robust mean estimation, second moment estimation, and robust linear regression.
In terms of techniques, our proposed GAN losses can be viewed as a smoothed and generalized Kolmogorov-Smirnov distance.
arXiv Detail & Related papers (2022-02-02T20:11:33Z) - Nonconvex Stochastic Scaled-Gradient Descent and Generalized Eigenvector
Problems [98.34292831923335]
Motivated by the problem of online correlation analysis, we propose the Stochastic Scaled-Gradient Descent (SSD) algorithm.
We bring these ideas together in an application to online correlation analysis, deriving for the first time an optimal one-time-scale algorithm with an explicit rate of local convergence to normality.
arXiv Detail & Related papers (2021-12-29T18:46:52Z) - Convergence of Gaussian-smoothed optimal transport distance with
sub-gamma distributions and dependent samples [12.77426855794452]
This paper provides convergence guarantees for estimating the GOT distance under more general settings.
A key step in our analysis is to show that the GOT distance is dominated by a family of kernel maximum mean discrepancy distances.
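The kernel maximum mean discrepancy (MMD) family mentioned above is easy to estimate from samples. Below is a minimal numpy sketch of the standard unbiased MMD² estimator with a Gaussian kernel; the bandwidth and sample sizes are illustrative choices, not values from the paper:

```python
# Sketch: unbiased estimate of squared kernel MMD with a Gaussian kernel,
# the family of distances said to dominate the GOT distance.
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Pairwise Gaussian kernel matrix between sample arrays (m,d) and (n,d)."""
    d2 = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def mmd2_unbiased(x, y, sigma=1.0):
    """Unbiased estimator of MMD^2 between samples x and y."""
    m, n = len(x), len(y)
    kxx = gaussian_kernel(x, x, sigma)
    kyy = gaussian_kernel(y, y, sigma)
    kxy = gaussian_kernel(x, y, sigma)
    term_xx = (kxx.sum() - np.trace(kxx)) / (m * (m - 1))  # drop diagonal
    term_yy = (kyy.sum() - np.trace(kyy)) / (n * (n - 1))
    return term_xx + term_yy - 2 * kxy.mean()

rng = np.random.default_rng(0)
same = mmd2_unbiased(rng.normal(size=(200, 2)), rng.normal(size=(200, 2)))
shifted = mmd2_unbiased(rng.normal(size=(200, 2)),
                        rng.normal(loc=2.0, size=(200, 2)))
```

As expected, the estimate is near zero for two samples from the same Gaussian and clearly positive once the second sample is mean-shifted.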
arXiv Detail & Related papers (2021-02-28T04:30:23Z) - Variational Transport: A Convergent Particle-BasedAlgorithm for Distributional Optimization [106.70006655990176]
A distributional optimization problem arises widely in machine learning and statistics.
We propose a novel particle-based algorithm, dubbed as variational transport, which approximately performs Wasserstein gradient descent.
We prove that when the objective function satisfies a functional version of the Polyak-Lojasiewicz (PL) condition (Polyak, 1963) and a smoothness condition, variational transport converges linearly.
arXiv Detail & Related papers (2020-12-21T18:33:13Z) - Two-sample Test using Projected Wasserstein Distance [18.46110328123008]
We develop a projected Wasserstein distance for the two-sample test, a fundamental problem in statistics and machine learning.
A key contribution is to couple optimal projection to find the low dimensional linear mapping to maximize the Wasserstein distance between projected probability distributions.
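The projected-Wasserstein idea can be sketched in a few lines: project both multivariate samples onto a one-dimensional direction, then compare them with the 1D Wasserstein distance (which has a closed form via sorting). The paper optimizes over projections; this toy uses fixed directions to show why the choice of direction matters:

```python
# Toy sketch of the projected Wasserstein two-sample statistic.
# The paper maximizes over linear projections; here we evaluate two
# fixed directions (an illustrative simplification).
import numpy as np

def w1_1d(a, b):
    """Empirical 1-Wasserstein distance between equal-size 1D samples:
    mean absolute difference of sorted values (optimal monotone coupling)."""
    return np.mean(np.abs(np.sort(a) - np.sort(b)))

def projected_w1(x, y, direction):
    """Project samples (n,d) onto a unit direction, compare in 1D."""
    u = np.asarray(direction, dtype=float)
    u = u / np.linalg.norm(u)
    return w1_1d(x @ u, y @ u)

rng = np.random.default_rng(1)
x = rng.normal(size=(500, 3))
y = rng.normal(size=(500, 3)) + np.array([1.5, 0.0, 0.0])  # shift in dim 0
stat = projected_w1(x, y, np.array([1.0, 0.0, 0.0]))  # shift is visible
null = projected_w1(x, y, np.array([0.0, 0.0, 1.0]))  # shift is invisible
```

Projecting along the shifted coordinate recovers a large statistic, while an orthogonal direction sees almost nothing, which is exactly why the test must optimize the projection.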
arXiv Detail & Related papers (2020-10-22T18:08:58Z) - On Projection Robust Optimal Transport: Sample Complexity and Model
Misspecification [101.0377583883137]
Projection robust (PR) OT seeks to maximize the OT cost between two measures by choosing a $k$-dimensional subspace onto which they can be projected.
Our first contribution is to establish several fundamental statistical properties of PR Wasserstein distances.
Next, we propose the integral PR Wasserstein (IPRW) distance as an alternative to the PRW distance, by averaging rather than optimizing on subspaces.
arXiv Detail & Related papers (2020-06-22T14:35:33Z) - Projection Robust Wasserstein Distance and Riemannian Optimization [107.93250306339694]
We show that the projection robust Wasserstein (PRW) distance is a robust variant of the Wasserstein projection pursuit (WPP) distance.
This paper provides a first step into the computation of the PRW distance and links the theory with experiments on synthetic and real data.
arXiv Detail & Related papers (2020-06-12T20:40:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.