On Unbalanced Optimal Transport: An Analysis of Sinkhorn Algorithm
- URL: http://arxiv.org/abs/2002.03293v2
- Date: Thu, 19 Nov 2020 04:02:46 GMT
- Title: On Unbalanced Optimal Transport: An Analysis of Sinkhorn Algorithm
- Authors: Khiem Pham and Khang Le and Nhat Ho and Tung Pham and Hung Bui
- Abstract summary: We show that the complexity of the Sinkhorn algorithm for finding an $\varepsilon$-approximate solution to the UOT problem is of order $\widetilde{\mathcal{O}}(n^2/\varepsilon)$, which is near-linear time.
Our proof technique is based on the geometric convergence of the Sinkhorn updates to the optimal dual solution of the entropic regularized UOT problem and some properties of the primal solution.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We provide a computational complexity analysis for the Sinkhorn algorithm
that solves the entropic regularized Unbalanced Optimal Transport (UOT) problem
between two measures of possibly different masses with at most $n$ components.
We show that the complexity of the Sinkhorn algorithm for finding an
$\varepsilon$-approximate solution to the UOT problem is of order
$\widetilde{\mathcal{O}}(n^2/ \varepsilon)$, which is near-linear time. To the
best of our knowledge, this complexity is better than the complexity of the
Sinkhorn algorithm for solving the Optimal Transport (OT) problem, which is of
order $\widetilde{\mathcal{O}}(n^2/\varepsilon^2)$. Our proof technique is
based on the geometric convergence of the Sinkhorn updates to the optimal dual
solution of the entropic regularized UOT problem and some properties of the
primal solution. It is also different from the proof for the complexity of the
Sinkhorn algorithm for approximating the OT problem since the UOT solution does
not have to meet the marginal constraints.
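The Sinkhorn updates analyzed in the abstract can be illustrated with a minimal NumPy sketch of entropic-regularized UOT where the marginal constraints are replaced by KL penalties. The damped scaling exponent $\tau/(\tau+\eta)$ is the standard unbalanced-Sinkhorn update; the parameter names (`eta` for the entropic weight, `tau` for the KL penalty weight) are illustrative and not the paper's notation.

```python
import numpy as np

def sinkhorn_uot(a, b, C, eta=0.05, tau=1.0, n_iter=500):
    """Damped Sinkhorn iterations for entropic-regularized unbalanced OT.

    a, b : nonnegative weight vectors (possibly different total masses)
    C    : cost matrix of shape (len(a), len(b))
    eta  : entropic regularization weight
    tau  : weight of the KL penalties on the marginals
    Returns the (approximate) optimal transport plan.
    """
    K = np.exp(-C / eta)            # Gibbs kernel
    u = np.ones_like(a, dtype=float)
    v = np.ones_like(b, dtype=float)
    fi = tau / (tau + eta)          # damping exponent from the KL penalty
    for _ in range(n_iter):
        u = (a / (K @ v)) ** fi     # row scaling update
        v = (b / (K.T @ u)) ** fi   # column scaling update
    return u[:, None] * K * v[None, :]
```

For large `tau` the KL penalties approach hard marginal constraints and the iteration recovers balanced Sinkhorn; the geometric convergence rate of the updates is what drives the $\widetilde{\mathcal{O}}(n^2/\varepsilon)$ bound in the paper.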
Related papers
- Obtaining Lower Query Complexities through Lightweight Zeroth-Order Proximal Gradient Algorithms [65.42376001308064]
We propose two variance-reduced ZO estimators for complex gradient problems.
We improve the state-of-the-art function query complexities from $\mathcal{O}\left(\min\left(\frac{dn^{1/2}}{\epsilon^2}, \frac{d}{\epsilon^3}\right)\right)$ to $\tilde{\mathcal{O}}\left(\frac{d}{\epsilon^2}\right)$.
arXiv Detail & Related papers (2024-10-03T15:04:01Z) - A Fully Parameter-Free Second-Order Algorithm for Convex-Concave Minimax Problems with Optimal Iteration Complexity [2.815239177328595]
We propose a Lipschitz-free cubic regularization (LF-CR) algorithm for solving the convex-concave minimax optimization problem.
We also propose a fully parameter-free cubic regularization (FF-CR) algorithm that does not require any parameters of the problem.
To the best of our knowledge, the proposed FF-CR algorithm is the first completely parameter-free second-order algorithm for solving convex-concave minimax optimization problems.
arXiv Detail & Related papers (2024-07-04T01:46:07Z) - On Partial Optimal Transport: Revising the Infeasibility of Sinkhorn and
Efficient Gradient Methods [17.14725907264431]
This paper studies the Partial Optimal Transport (POT) problem between two unbalanced measures with at most $n$ supports.
We propose a novel rounding algorithm for POT, and then provide a feasible Sinkhorn procedure with a revised complexity of $\widetilde{\mathcal{O}}(n^2/\varepsilon^4)$.
arXiv Detail & Related papers (2023-12-21T15:56:09Z) - Near-Optimal Bounds for Learning Gaussian Halfspaces with Random
Classification Noise [50.64137465792738]
We show that any efficient SQ algorithm for the problem requires sample complexity at least $\Omega(d^{1/2}/(\max\{p, \epsilon\})^2)$.
Our lower bound suggests that this quadratic dependence on $1/epsilon$ is inherent for efficient algorithms.
arXiv Detail & Related papers (2023-07-13T18:59:28Z) - A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization [53.044526424637866]
In this paper we consider finding an approximate second-order stationary point (SOSP) of a general nonconvex conic optimization problem that minimizes a twice differentiable function subject to conic constraints.
In particular, we propose a Newton-CG based barrier-augmented Lagrangian method for finding an approximate SOSP.
arXiv Detail & Related papers (2023-01-10T20:43:29Z) - Mind the gap: Achieving a super-Grover quantum speedup by jumping to the
end [114.3957763744719]
We present a quantum algorithm that has rigorous runtime guarantees for several families of binary optimization problems.
We show that the algorithm finds the optimal solution in time $O^*(2^{(0.5-c)n})$ for an $n$-independent constant $c$.
We also show that for a large fraction of random instances from the $k$-spin model and for any fully satisfiable or slightly frustrated $k$-CSP formula, statement (a) is the case.
arXiv Detail & Related papers (2022-12-03T02:45:23Z) - An Accelerated Stochastic Algorithm for Solving the Optimal Transport
Problem [2.1485350418225244]
A primal-dual accelerated gradient descent with variance reduction algorithm (PDASGD) is proposed to solve linear-constrained optimization problems.
PDASGD enjoys the best-known computational complexity -- $\widetilde{\mathcal{O}}(n^2/\epsilon)$, where $n$ is the number of atoms, and $\epsilon>0$ is the accuracy.
arXiv Detail & Related papers (2022-03-02T01:16:10Z) - Time and Query Optimal Quantum Algorithms Based on Decision Trees [2.492300648514128]
We show that a quantum algorithm can be implemented in time $\tilde{O}(\sqrt{GT})$.
Our algorithm is based on non-binary span programs and their efficient implementation.
arXiv Detail & Related papers (2021-05-18T06:51:11Z) - On Robust Optimal Transport: Computational Complexity, Low-rank
Approximation, and Barycenter Computation [14.80695185915604]
We consider two robust versions of optimal transport, named $\textit{Robust Semi-constrained Optimal Transport}$ (RSOT) and $\textit{Robust Unconstrained Optimal Transport}$ (ROT).
For both problems in the discrete settings, we propose Sinkhorn-based algorithms that produce $\varepsilon$-approximations of RSOT and ROT in $\widetilde{\mathcal{O}}(\frac{n^2}{\varepsilon})$ time.
arXiv Detail & Related papers (2021-02-13T03:55:52Z) - Streaming Complexity of SVMs [110.63976030971106]
We study the space complexity of solving the bias-regularized SVM problem in the streaming model.
We show that, for both problems, one can obtain streaming algorithms with space polynomially smaller than $\frac{1}{\lambda\epsilon}$.
arXiv Detail & Related papers (2020-07-07T17:10:00Z) - Second-order Conditional Gradient Sliding [79.66739383117232]
We present the Second-Order Conditional Gradient Sliding (SOCGS) algorithm.
The SOCGS algorithm converges quadratically in primal gap after a finite number of linearly convergent iterations.
It is useful when the feasible region can only be accessed efficiently through a linear optimization oracle.
arXiv Detail & Related papers (2020-02-20T17:52:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.