Sinkhorn Flow: A Continuous-Time Framework for Understanding and
Generalizing the Sinkhorn Algorithm
- URL: http://arxiv.org/abs/2311.16706v1
- Date: Tue, 28 Nov 2023 11:29:12 GMT
- Title: Sinkhorn Flow: A Continuous-Time Framework for Understanding and
Generalizing the Sinkhorn Algorithm
- Authors: Mohammad Reza Karimi, Ya-Ping Hsieh, Andreas Krause
- Abstract summary: We introduce a continuous-time analogue of the Sinkhorn algorithm.
This perspective allows us to derive novel variants of Sinkhorn schemes that are robust to noise and bias.
- Score: 49.45427072226592
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many problems in machine learning can be formulated as solving
entropy-regularized optimal transport on the space of probability measures. The
canonical approach involves the Sinkhorn iterates, renowned for their rich
mathematical properties. Recently, the Sinkhorn algorithm has been recast
within the mirror descent framework, thus benefiting from classical
optimization theory insights. Here, we build upon this result by introducing a
continuous-time analogue of the Sinkhorn algorithm. This perspective allows us
to derive novel variants of Sinkhorn schemes that are robust to noise and bias.
Moreover, our continuous-time dynamics not only generalize but also offer a
unified perspective on several recently discovered dynamics in machine learning
and mathematics, such as the "Wasserstein mirror flow" of (Deb et al. 2023) or
the "mean-field Schr\"odinger equation" of (Claisse et al. 2023).
Related papers
- Bregman-divergence-based Arimoto-Blahut algorithm [53.64687146666141]
We generalize the Arimoto-Blahut algorithm to a general function defined over a Bregman-divergence system.
We propose a convex-optimization-free algorithm that can be applied to classical and quantum rate-distortion theory.
arXiv Detail & Related papers (2024-08-10T06:16:24Z) - Already Moderate Population Sizes Provably Yield Strong Robustness to Noise [53.27802701790209]
- Already Moderate Population Sizes Provably Yield Strong Robustness to Noise [53.27802701790209]
We show that two evolutionary algorithms can tolerate constant noise probabilities without increasing the runtime on the OneMax benchmark.
Our results are based on a novel proof argument that the noiseless offspring can be seen as a biased uniform crossover between the parent and the noisy offspring.
arXiv Detail & Related papers (2024-04-02T16:35:52Z) - Accelerating Sinkhorn Algorithm with Sparse Newton Iterations [14.094908995798757]
We present Sinkhorn-Newton-Sparse (SNS), an extension to the Sinkhorn algorithm.
SNS converges orders of magnitude faster across a wide range of practical cases.
arXiv Detail & Related papers (2024-01-20T21:23:09Z) - Compressed online Sinkhorn [3.2534959204741085]
We revisit the recently introduced online Sinkhorn algorithm of [Mensch and Peyré, 2020].
First, we improve the convergence analysis for the online Sinkhorn algorithm; the new rate we obtain is faster than the previous one under certain parameter choices.
Second, we propose the compressed online Sinkhorn algorithm, which combines measure-compression techniques with the online Sinkhorn algorithm.
arXiv Detail & Related papers (2023-10-08T05:33:32Z) - Data Assimilation for Sign-indefinite Priors: A generalization of
Sinkhorn's algorithm [0.0]
We seek to revise sign-indefinite multi-dimensional arrays so that the updated values agree with specified marginals.
Our approach follows the rationale of Schrödinger's problem, which aims to update a "prior" probability measure to agree with marginal distributions.
arXiv Detail & Related papers (2023-08-22T21:13:39Z) - Simulating Markovian open quantum systems using higher-order series
expansion [1.713291434132985]
We present an efficient quantum algorithm for simulating the dynamics of Markovian open quantum systems.
Our algorithm is conceptually cleaner, and it only uses simple quantum primitives without compressed encoding.
arXiv Detail & Related papers (2022-12-05T06:02:50Z) - Losing momentum in continuous-time stochastic optimisation [42.617042045455506]
Momentum-based optimisation algorithms have become particularly widespread.
In this work, we analyse a continuous-time model for gradient descent with momentum.
We also train a convolutional neural network in an image classification problem.
arXiv Detail & Related papers (2022-09-08T10:46:05Z) - A Unified Framework for Implicit Sinkhorn Differentiation [58.56866763433335]
We propose an algorithm that obtains analytical gradients of a Sinkhorn layer via implicit differentiation.
We show that it is computationally more efficient, particularly when resources like GPU memory are scarce.
arXiv Detail & Related papers (2022-05-13T14:45:31Z) - Debiased Sinkhorn barycenters [110.79706180350507]
- Debiased Sinkhorn barycenters [110.79706180350507]
Entropy regularization in optimal transport (OT) has driven much of the recent interest in Wasserstein metrics and barycenters in machine learning.
We show how this bias is tightly linked to the reference measure that defines the entropy regularizer.
We propose debiased Wasserstein barycenters that preserve the best of both worlds: fast Sinkhorn-like iterations without entropy smoothing.
arXiv Detail & Related papers (2020-06-03T23:06:02Z) - Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
- Forecasting Sequential Data using Consistent Koopman Autoencoders [52.209416711500005]
A new class of physics-based methods related to Koopman theory has been introduced, offering an alternative for processing nonlinear dynamical systems.
We propose a novel Consistent Koopman Autoencoder model which, unlike the majority of existing work, leverages the forward and backward dynamics.
Key to our approach is a new analysis which explores the interplay between consistent dynamics and their associated Koopman operators.
arXiv Detail & Related papers (2020-03-04T18:24:30Z)