Approximative Algorithms for Multi-Marginal Optimal Transport and Free-Support Wasserstein Barycenters
- URL: http://arxiv.org/abs/2202.00954v1
- Date: Wed, 2 Feb 2022 10:59:54 GMT
- Title: Approximative Algorithms for Multi-Marginal Optimal Transport and Free-Support Wasserstein Barycenters
- Authors: Johannes von Lindheim
- Abstract summary: We present two algorithms to approximate the solution of multi-marginal optimal transport (MOT) with squared Euclidean costs for $N$ discrete probability measures.
They are fast, memory-efficient and easy to implement and can be used with any sparse OT solver as a black box.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Computationally solving multi-marginal optimal transport (MOT) with squared
Euclidean costs for $N$ discrete probability measures has recently attracted
considerable attention, in part because of the correspondence of its solutions
with Wasserstein-$2$ barycenters, which have many applications in data science.
In general, this problem is NP-hard, calling for practical approximative
algorithms. While entropic regularization has been successfully applied to
approximate Wasserstein barycenters, this loses the sparsity of the optimal
solution, making it difficult to solve the MOT problem directly in practice
because of the curse of dimensionality. Thus, for obtaining barycenters, one
usually resorts to fixed-support restrictions to a grid, which is, however,
prohibitive in higher ambient dimensions $d$. In this paper, after analyzing
the relationship between MOT and barycenters, we present two algorithms to
approximate the solution of MOT directly, requiring mainly just $N-1$ standard
two-marginal OT computations. Thus, they are fast, memory-efficient and easy to
implement and can be used with any sparse OT solver as a black box. Moreover,
they produce sparse solutions and show promising numerical results. We analyze
these algorithms theoretically, proving upper and lower bounds for the relative
approximation error.
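To make the cost structure concrete, the following is a minimal sketch, assuming the POT library (ot.dist and ot.emd) and uniform barycenter weights $\lambda_i = 1/N$, of how an approximate free-support barycenter for the problem $\min_{\nu} \sum_{i=1}^{N} \lambda_i W_2^2(\mu_i, \nu)$ can be assembled from $N-1$ two-marginal OT plans computed against a fixed reference measure. It only illustrates the black-box use of a sparse two-marginal solver described above; it is not the authors' exact algorithm, and the function name approx_barycenter, the reference choice ref=0, and the barycentric-projection averaging step are illustrative assumptions.

    # Hedged sketch: approximate a free-support Wasserstein-2 barycenter from
    # N-1 two-marginal OT plans against a fixed reference measure.
    # Requires the POT library (pip install pot). Illustrative only.
    import numpy as np
    import ot  # Python Optimal Transport

    def approx_barycenter(supports, weights, ref=0):
        """supports: list of (n_i, d) arrays; weights: list of (n_i,) probability vectors."""
        N = len(supports)
        X0 = np.asarray(supports[ref], dtype=float)
        a0 = np.asarray(weights[ref], dtype=float)
        bary = X0 / N  # running average, initialized with the reference support
        for i in range(N):
            if i == ref:
                continue
            Xi = np.asarray(supports[i], dtype=float)
            M = ot.dist(X0, Xi)  # cost matrix, squared Euclidean by default
            P = ot.emd(a0, np.asarray(weights[i], dtype=float), M)  # sparse two-marginal OT plan
            # Barycentric projection: mean target location of each reference atom under P.
            bary += (P @ Xi) / a0[:, None] / N
        return bary, a0  # approximate barycenter support (n_ref, d) and its weights

For three empirical measures X1, X2, X3 with uniform atom weights, approx_barycenter([X1, X2, X3], [np.full(len(X), 1.0/len(X)) for X in (X1, X2, X3)]) returns a candidate support of size $n_1$ together with the reference weights, after only $N-1 = 2$ calls to the two-marginal solver.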
Related papers
- Statistical and Computational Guarantees of Kernel Max-Sliced Wasserstein Distances [9.608373793625107]
The kernel max-sliced (KMS) Wasserstein distance is developed for comparing probability distributions.
We show that computing the KMS $2$-Wasserstein distance is NP-hard.
arXiv Detail & Related papers (2024-05-24T11:14:56Z)
- Estimating Barycenters of Distributions with Neural Optimal Transport [93.28746685008093]
We propose a new scalable approach for solving the Wasserstein barycenter problem.
Our methodology is based on the recent Neural OT solver.
We also establish theoretical error bounds for our proposed approach.
arXiv Detail & Related papers (2024-02-06T09:17:07Z)
- Covariance alignment: from maximum likelihood estimation to Gromov-Wasserstein [27.2585517709102]
We show that the celebrated Gromov-Wasserstein algorithm from optimal transport is also minimax optimal.
These results give the first statistical justification for the deployment of the Gromov-Wasserstein algorithm in practice.
arXiv Detail & Related papers (2023-11-22T18:55:27Z)
- An Oblivious Stochastic Composite Optimization Algorithm for Eigenvalue Optimization Problems [76.2042837251496]
We introduce two oblivious mirror descent algorithms based on a complementary composite setting.
Remarkably, both algorithms work without prior knowledge of the Lipschitz constant or smoothness of the objective function.
We show how to extend our framework to scale and demonstrate the efficiency and robustness of our methods on large-scale semidefinite programs.
arXiv Detail & Related papers (2023-06-30T08:34:29Z)
- Linearized Wasserstein dimensionality reduction with approximation guarantees [65.16758672591365]
LOT Wassmap is a computationally feasible algorithm to uncover low-dimensional structures in the Wasserstein space.
We show that LOT Wassmap attains correct embeddings and that the quality improves with increased sample size.
We also show how LOT Wassmap significantly reduces the computational cost when compared to algorithms that depend on pairwise distance computations.
arXiv Detail & Related papers (2023-02-14T22:12:16Z)
- Pseudonorm Approachability and Applications to Regret Minimization [73.54127663296906]
We convert high-dimensional $\ell_\infty$-approachability problems to low-dimensional pseudonorm approachability problems.
We develop an algorithmic theory of pseudonorm approachability, analogous to previous work on approachability for $\ell$ and other norms.
arXiv Detail & Related papers (2023-02-03T03:19:14Z)
- On the complexity of the optimal transport problem with graph-structured cost [9.24979291231758]
Multi-marginal optimal transport (MOT) is a generalization of optimal transport to multiple marginals.
The usage of MOT has been largely impeded by its computational complexity, which scales exponentially in the number of marginals.
arXiv Detail & Related papers (2021-10-01T19:29:59Z)
- Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex Decentralized Optimization Over Time-Varying Networks [79.16773494166644]
We consider the task of minimizing the sum of smooth and strongly convex functions stored in a decentralized manner across the nodes of a communication network.
We establish lower bounds for this setting and design two optimal algorithms that attain them.
We corroborate the theoretical efficiency of these algorithms by performing an experimental comparison with existing state-of-the-art methods.
arXiv Detail & Related papers (2021-06-08T15:54:44Z)
- Projection Robust Wasserstein Barycenter [36.97843660480747]
Approximating the Wasserstein barycenter is numerically challenging because of the curse of dimensionality.
This paper proposes the projection robust Wasserstein barycenter (PRWB) that mitigates the curse of dimensionality.
arXiv Detail & Related papers (2021-02-05T19:23:35Z)
- Continuous Wasserstein-2 Barycenter Estimation without Minimax Optimization [94.18714844247766]
Wasserstein barycenters provide a geometric notion of the weighted average of probability measures based on optimal transport.
We present a scalable algorithm to compute Wasserstein-2 barycenters given sample access to the input measures.
arXiv Detail & Related papers (2021-02-02T21:01:13Z)
- A Riemannian Block Coordinate Descent Method for Computing the Projection Robust Wasserstein Distance [36.97843660480747]
The Wasserstein distance has become increasingly important in machine learning and deep learning.
A recently proposed approach to alleviate the curse of dimensionality is to project the sampled data onto a lower-dimensional subspace, and then compute the Wasserstein distance between the projected data.
However, this approach requires solving a max-min problem over the Stiefel manifold, which is very challenging in practice.
We propose a Riemannian block coordinate descent (RBCD) method to solve this problem, based on a novel reformulation of the regularized max-min problem over the Stiefel manifold.
arXiv Detail & Related papers (2020-12-09T17:47:56Z)
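As background for the last two entries, the projection robust Wasserstein distance they refer to can be written as a max-min problem over the Stiefel manifold; the following standard formulation is included here only for clarity and is not quoted from the abstracts above:

    % Projection robust Wasserstein-2 distance between measures mu and nu on R^d,
    % with k-dimensional orthonormal projections U and couplings Pi(mu, nu):
    \mathcal{P}_k(\mu, \nu)
      \;=\; \max_{\substack{U \in \mathbb{R}^{d \times k} \\ U^{\top} U = I_k}}
      \;\min_{\pi \in \Pi(\mu, \nu)}
      \left( \int \lVert U^{\top} x - U^{\top} y \rVert^2 \, \mathrm{d}\pi(x, y) \right)^{1/2}

The inner minimization is an ordinary two-marginal OT problem on the projected points, while the outer maximization over orthonormal $U$ is the part addressed by the Riemannian block coordinate descent method.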
This list is automatically generated from the titles and abstracts of the papers on this site.