Wasserstein Iterative Networks for Barycenter Estimation
- URL: http://arxiv.org/abs/2201.12245v1
- Date: Fri, 28 Jan 2022 16:59:47 GMT
- Title: Wasserstein Iterative Networks for Barycenter Estimation
- Authors: Alexander Korotin, Vage Egiazarian, Lingxiao Li, Evgeny Burnaev
- Abstract summary: We present an algorithm to approximate the Wasserstein-2 barycenters of continuous measures via a generative model.
Based on the celebrity faces dataset, we construct the Ave, celeba! dataset, which can be used for quantitative evaluation of barycenter algorithms.
- Score: 80.23810439485078
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Wasserstein barycenters have become popular due to their ability to represent
the average of probability measures in a geometrically meaningful way. In this
paper, we present an algorithm to approximate the Wasserstein-2 barycenters of
continuous measures via a generative model. Previous approaches rely on
regularization (entropic/quadratic) which introduces bias or on input convex
neural networks which are not expressive enough for large-scale tasks. In
contrast, our algorithm does not introduce bias and allows using arbitrary
neural networks. In addition, based on the celebrity faces dataset, we
construct Ave, celeba! dataset which can be used for quantitative evaluation of
barycenter algorithms by using standard metrics of generative models such as
FID.
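As background for what a Wasserstein-2 barycenter is, the one-dimensional case has a closed form: the barycenter's quantile function is the weighted average of the input quantile functions. A minimal NumPy sketch of this (illustrative only; not the paper's generative-model algorithm, and the distributions are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
# samples from two hypothetical 1-D input measures
xs = [rng.normal(-2.0, 0.5, 5000), rng.normal(3.0, 1.5, 5000)]
weights = np.array([0.5, 0.5])

# In 1-D the W2 barycenter's quantile function is the weighted average
# of the input quantile functions; with equal sample sizes this amounts
# to sorting each sample and averaging order statistics rank by rank.
quantiles = np.stack([np.sort(s) for s in xs])
bary_samples = weights @ quantiles
```

For two Gaussians N(-2, 0.5) and N(3, 1.5) with equal weights, the barycenter is again Gaussian, here approximately N(0.5, 1.0); in higher dimensions no such closed form exists, which is what motivates the generative approach above.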
Related papers
- Estimating Barycenters of Distributions with Neural Optimal Transport [93.28746685008093]
We propose a new scalable approach for solving the Wasserstein barycenter problem.
Our methodology is based on the recent Neural OT solver.
We also establish theoretical error bounds for our proposed approach.
arXiv Detail & Related papers (2024-02-06T09:17:07Z)
- Generalizing Backpropagation for Gradient-Based Interpretability [103.2998254573497]
We show that the gradient of a model is a special case of a more general formulation using semirings.
This observation allows us to generalize the backpropagation algorithm to efficiently compute other interpretable statistics.
arXiv Detail & Related papers (2023-07-06T15:19:53Z)
- Online stochastic Newton methods for estimating the geometric median and applications [0.0]
In the context of large samples, a small number of individuals might spoil basic statistical indicators like the mean.
This paper focuses on estimating the geometric median of a random variable, which is a robust indicator of central tendency.
arXiv Detail & Related papers (2023-04-03T07:47:20Z)
- Projection Robust Wasserstein Barycenter [36.97843660480747]
Approximating the Wasserstein barycenter is numerically challenging because of the curse of dimensionality.
This paper proposes the projection robust Wasserstein barycenter (PRWB) that mitigates the curse of dimensionality.
arXiv Detail & Related papers (2021-02-05T19:23:35Z)
- Learning High Dimensional Wasserstein Geodesics [55.086626708837635]
We propose a new formulation and learning strategy for computing the Wasserstein geodesic between two probability distributions in high dimensions.
By applying the method of Lagrange multipliers to the dynamic formulation of the optimal transport (OT) problem, we derive a minimax problem whose saddle point is the Wasserstein geodesic.
We then parametrize the functions by deep neural networks and design a sample based bidirectional learning algorithm for training.
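In one dimension the object being learned has a closed form that helps ground the high-dimensional method: McCann's displacement interpolation pairs samples by rank and interpolates linearly. An illustrative sketch (classical construction, not the paper's neural algorithm; the distributions are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
x0 = np.sort(rng.normal(0.0, 1.0, 2000))   # samples from mu_0
x1 = np.sort(rng.normal(5.0, 2.0, 2000))   # samples from mu_1

# 1-D displacement interpolation: the optimal map pairs samples by rank,
# so points on the geodesic mu_t are convex combinations of rank-matched
# samples.
def geodesic_samples(t):
    return (1.0 - t) * x0 + t * x1

mid = geodesic_samples(0.5)
```

For these two Gaussians the midpoint measure is approximately N(2.5, 1.5); in high dimensions the pairing must be learned, which is the minimax problem described above.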
arXiv Detail & Related papers (2021-02-05T04:25:28Z)
- Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization [94.18714844247766]
Wasserstein barycenters provide a geometric notion of the weighted average of probability measures based on optimal transport.
We present a scalable algorithm to compute Wasserstein-2 barycenters given sample access to the input measures.
arXiv Detail & Related papers (2021-02-02T21:01:13Z)
- Continual Learning of Generative Models with Limited Data: From Wasserstein-1 Barycenter to Adaptive Coalescence [22.82926450287203]
Learning generative models is challenging for a network edge node with limited data and computing power.
This study aims to develop a framework that systematically optimizes continual learning of generative models.
arXiv Detail & Related papers (2021-01-22T17:15:39Z)
- Continuous Regularized Wasserstein Barycenters [51.620781112674024]
We introduce a new dual formulation for the regularized Wasserstein barycenter problem.
We establish strong duality and use the corresponding primal-dual relationship to parametrize the barycenter implicitly using the dual potentials of regularized transport problems.
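For context on the regularized setting, the standard discrete counterpart of this problem can be solved with iterative Bregman projections (the classical fixed-grid algorithm of Benamou et al., sketched here for illustration; it is not this paper's continuous dual method):

```python
import numpy as np

def entropic_barycenter(hists, M, reg, weights=None, n_iter=300):
    """Entropy-regularized W2 barycenter of histograms on a shared grid
    via iterative Bregman projections."""
    n, d = hists.shape
    if weights is None:
        weights = np.full(n, 1.0 / n)
    K = np.exp(-M / reg)                  # Gibbs kernel
    v = np.ones((n, d))
    for _ in range(n_iter):
        u = hists / (v @ K.T)             # match each input marginal
        Ku = u @ K                        # rows hold K^T u_k
        p = np.exp(weights @ np.log(Ku))  # weighted geometric mean
        v = p / Ku
    return p / p.sum()

# two Gaussian histograms on [0, 1]
x = np.linspace(0.0, 1.0, 100)
a = np.exp(-((x - 0.25) ** 2) / (2 * 0.05 ** 2)); a /= a.sum()
b = np.exp(-((x - 0.75) ** 2) / (2 * 0.05 ** 2)); b /= b.sum()
M = (x[:, None] - x[None, :]) ** 2        # squared-distance cost
bary = entropic_barycenter(np.stack([a, b]), M, reg=0.01)
```

The entropic term blurs the result (the bias the main paper above avoids), but the barycenter of these two symmetric inputs is still centered at 0.5.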
arXiv Detail & Related papers (2020-08-28T08:28:06Z)
- Scalable Computations of Wasserstein Barycenter via Input Convex Neural Networks [15.171726731041055]
The Wasserstein barycenter is a principled approach to representing the weighted mean of a given set of probability distributions.
We present a novel scalable algorithm to approximate the Wasserstein Barycenters aiming at high-dimensional applications in machine learning.
arXiv Detail & Related papers (2020-07-08T22:41:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.