Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization
- URL: http://arxiv.org/abs/2102.01752v1
- Date: Tue, 2 Feb 2021 21:01:13 GMT
- Title: Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization
- Authors: Alexander Korotin, Lingxiao Li, Justin Solomon, Evgeny Burnaev
- Abstract summary: Wasserstein barycenters provide a geometric notion of the weighted average of probability measures based on optimal transport.
We present a scalable algorithm to compute Wasserstein-2 barycenters given sample access to the input measures.
- Score: 94.18714844247766
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Wasserstein barycenters provide a geometric notion of the weighted average of
probability measures based on optimal transport. In this paper, we present a
scalable algorithm to compute Wasserstein-2 barycenters given sample access to
the input measures, which are not restricted to being discrete. While past
approaches rely on entropic or quadratic regularization, we employ input convex
neural networks and cycle-consistency regularization to avoid introducing bias.
As a result, our approach does not resort to minimax optimization. We provide
theoretical analysis on error bounds as well as empirical evidence of the
effectiveness of the proposed approach in low-dimensional qualitative scenarios
and high-dimensional quantitative experiments.
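For a concrete picture of the two ingredients named in the abstract, the sketch below shows an input convex neural network (ICNN) whose gradient acts as a transport map, together with a cycle-consistency penalty between a forward and an inverse potential. This is a minimal PyTorch illustration under assumed names and toy data (SimpleICNN, grad_map, the toy measure, and the penalty weight are hypothetical); it is not the authors' code, and the loss is a placeholder showing how gradients flow through the gradient maps rather than the paper's barycenter objective.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SimpleICNN(nn.Module):
        """Convex in its input: hidden-to-hidden weights are clamped to be
        non-negative and the activation (softplus) is convex and non-decreasing."""
        def __init__(self, dim, hidden=64):
            super().__init__()
            self.W_x = nn.ModuleList([nn.Linear(dim, hidden),
                                      nn.Linear(dim, hidden),
                                      nn.Linear(dim, 1)])
            self.W_z = nn.ParameterList([nn.Parameter(0.1 * torch.rand(hidden, hidden)),
                                         nn.Parameter(0.1 * torch.rand(1, hidden))])

        def forward(self, x):
            z = F.softplus(self.W_x[0](x))
            z = F.softplus(self.W_x[1](x) + z @ self.W_z[0].clamp(min=0).t())
            return self.W_x[2](x) + z @ self.W_z[1].clamp(min=0).t()

    def grad_map(potential, x):
        """Transport map given by the gradient of a convex potential."""
        if not x.requires_grad:
            x = x.clone().requires_grad_(True)
        return torch.autograd.grad(potential(x).sum(), x, create_graph=True)[0]

    dim, n = 2, 256
    psi = SimpleICNN(dim)      # potential mapping one input measure forward
    phi = SimpleICNN(dim)      # potential for the approximate inverse map
    opt = torch.optim.Adam(list(psi.parameters()) + list(phi.parameters()), lr=1e-3)
    x = torch.randn(n, dim) + 2.0                       # toy samples from one input measure

    for step in range(200):
        y = grad_map(psi, x)                            # pushforward of the samples
        x_back = grad_map(phi, y)                       # map back with the second potential
        cycle = ((x_back - x) ** 2).sum(dim=1).mean()   # cycle-consistency penalty
        cost = ((y - x) ** 2).sum(dim=1).mean()         # quadratic transport cost term
        loss = cost + 10.0 * cycle                      # placeholder objective, not the paper's
        opt.zero_grad()
        loss.backward()
        opt.step()

In the paper, such potentials are trained jointly across several input measures so that their gradient maps agree on a common barycenter; the snippet only exercises the single-pair mechanics.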
Related papers
- Estimating Barycenters of Distributions with Neural Optimal Transport [93.28746685008093]
We propose a new scalable approach for solving the Wasserstein barycenter problem.
Our methodology is based on the recent Neural OT solver.
We also establish theoretical error bounds for our proposed approach.
arXiv Detail & Related papers (2024-02-06T09:17:07Z)
- Geometry-Aware Normalizing Wasserstein Flows for Optimal Causal Inference [0.0]
This paper presents a groundbreaking approach to causal inference by integrating continuous normalizing flows with parametric submodels.
We leverage optimal transport and Wasserstein gradient flows to develop causal inference methodologies with minimal variance in finite-sample settings.
Preliminary experiments showcase our method's superiority, yielding lower mean-squared errors compared to standard flows.
arXiv Detail & Related papers (2023-11-30T18:59:05Z)
- Sobolev Space Regularised Pre Density Models [51.558848491038916]
We propose a new approach to non-parametric density estimation that is based on regularizing a Sobolev norm of the density.
This method is statistically consistent, and makes the inductive bias of the model clear and interpretable.
arXiv Detail & Related papers (2023-07-25T18:47:53Z)
- Wasserstein Iterative Networks for Barycenter Estimation [80.23810439485078]
We present an algorithm to approximate the Wasserstein-2 barycenters of continuous measures via a generative model.
Based on the celebrity faces dataset, we construct the Ave, celeba! dataset, which can be used for quantitative evaluation of barycenter algorithms.
arXiv Detail & Related papers (2022-01-28T16:59:47Z)
- Streaming computation of optimal weak transport barycenters [13.664682865991255]
We provide a theoretical analysis of the weak barycenter and its relationship to the classic Wasserstein barycenter.
We provide iterative algorithms to compute a weak barycenter for either finite or infinite families of arbitrary measures.
arXiv Detail & Related papers (2021-02-26T10:08:02Z)
- Continuous Regularized Wasserstein Barycenters [51.620781112674024]
We introduce a new dual formulation for the regularized Wasserstein barycenter problem.
We establish strong duality and use the corresponding primal-dual relationship to parametrize the barycenter implicitly using the dual potentials of regularized transport problems.
arXiv Detail & Related papers (2020-08-28T08:28:06Z)
- Scalable Computations of Wasserstein Barycenter via Input Convex Neural Networks [15.171726731041055]
The Wasserstein barycenter is a principled approach to representing the weighted mean of a given set of probability distributions.
We present a novel scalable algorithm to approximate Wasserstein barycenters, aimed at high-dimensional applications in machine learning.
arXiv Detail & Related papers (2020-07-08T22:41:18Z)
- Distributed Averaging Methods for Randomized Second Order Optimization [54.51566432934556]
We consider distributed optimization problems where forming the Hessian is computationally challenging and communication is a bottleneck.
We develop unbiased parameter averaging methods for randomized second order optimization that employ sampling and sketching of the Hessian.
We also extend the framework of second order averaging methods to introduce an unbiased distributed optimization framework for heterogeneous computing systems.
arXiv Detail & Related papers (2020-02-16T09:01:18Z)
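The last related paper above (distributed averaging for randomized second-order optimization) describes workers that sample or sketch the Hessian and a driver that averages the resulting updates. The NumPy sketch below illustrates only that generic averaging pattern on an assumed least-squares problem; the problem, subsample size, and damping are illustrative choices, and the snippet does not reproduce the paper's unbiased estimators.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, workers, m = 2000, 10, 8, 200           # data size, dimension, workers, subsample size

    A = rng.normal(size=(n, d))                   # least-squares problem: min_w ||A w - b||^2
    b = A @ rng.normal(size=d) + 0.1 * rng.normal(size=n)
    w = np.zeros(d)

    for it in range(5):
        grad = 2 * A.T @ (A @ w - b) / n          # full gradient at the current iterate
        steps = []
        for _ in range(workers):
            idx = rng.choice(n, size=m, replace=False)
            H_hat = 2 * A[idx].T @ A[idx] / m     # worker's subsampled Hessian estimate
            steps.append(np.linalg.solve(H_hat + 1e-6 * np.eye(d), grad))
        w -= np.mean(steps, axis=0)               # average the randomized Newton steps
        print(it, float(np.linalg.norm(A @ w - b) ** 2 / n))

Plain averaging of inverse-subsampled-Hessian steps is generally biased, which is precisely the issue the paper's unbiased averaging methods are designed to remove.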