Data Assimilation for Sign-indefinite Priors: A generalization of
Sinkhorn's algorithm
- URL: http://arxiv.org/abs/2308.11791v1
- Date: Tue, 22 Aug 2023 21:13:39 GMT
- Title: Data Assimilation for Sign-indefinite Priors: A generalization of
Sinkhorn's algorithm
- Authors: Anqi Dong, Tryphon T. Georgiou, and Allen Tannenbaum
- Abstract summary: We seek to revise sign-indefinite multi-dimensional arrays in a way that the updated values agree with specified marginals.
Our approach follows the rationale in Schrödinger's problem, aimed at updating a "prior" probability measure to agree with marginal distributions.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The purpose of this work is to develop a framework to calibrate signed
datasets so as to be consistent with specified marginals by suitably extending
the Schr\"odinger-Fortet-Sinkhorn paradigm. Specifically, we seek to revise
sign-indefinite multi-dimensional arrays in a way that the updated values agree
with specified marginals. Our approach follows the rationale in Schrödinger's
problem, aimed at updating a "prior" probability measure to agree with marginal
distributions. The celebrated Sinkhorn's algorithm (established earlier by R.
Fortet) that solves Schrödinger's problem found early applications in
calibrating contingency tables in statistics and, more recently, multi-marginal
problems in machine learning and optimal transport. Herein, we postulate a
sign-indefinite prior in the form of a multi-dimensional array, and propose an
optimization problem to suitably update this prior to ensure consistency with
given marginals. The resulting algorithm generalizes the Sinkhorn algorithm in
that it amounts to iterative scaling of the entries of the array along
different coordinate directions. The scaling is multiplicative but also, in
contrast to Sinkhorn, inverse-multiplicative depending on the sign of the
entries. Our algorithm reduces to the classical Sinkhorn algorithm when the
entries of the prior are positive.
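To make the description above concrete, here is a minimal NumPy sketch of such a coordinate-wise, sign-dependent scaling for a two-dimensional array. The scaling factor is chosen per row/column from a scalar quadratic so that the signed sums match the targets; this particular rule, and the function and variable names, are assumptions made for this illustration rather than a transcription of the paper's update, though it does collapse to the classical Sinkhorn scaling when the prior has no negative entries.

```python
# Illustrative sketch only: one plausible reading of the sign-dependent scaling
# described in the abstract, not the authors' reference implementation.
import numpy as np

def signed_sinkhorn(A, row_marginal, col_marginal, n_iter=500, tol=1e-10):
    """Iteratively rescale the signed 2-D array A so that its row and column
    sums approach the prescribed (possibly signed) marginals.

    Positive entries are scaled by s, negative entries by 1/s, so a row sum
    becomes s*P - N/s, with P (N) the positive (absolute negative) row mass;
    s is then the positive root of P*s^2 - r*s - N = 0. Assumes every row and
    column carries some positive mass (P > 0)."""
    X = np.asarray(A, dtype=float).copy()
    r = np.asarray(row_marginal, dtype=float)
    c = np.asarray(col_marginal, dtype=float)
    for _ in range(n_iter):
        # Row step: one scaling factor per row.
        P = np.clip(X, 0.0, None).sum(axis=1)
        N = np.clip(-X, 0.0, None).sum(axis=1)
        s = (r + np.sqrt(r**2 + 4.0 * P * N)) / (2.0 * P)
        X = np.where(X > 0, X * s[:, None], X / s[:, None])
        # Column step: same sign-dependent rule along the other coordinate.
        P = np.clip(X, 0.0, None).sum(axis=0)
        N = np.clip(-X, 0.0, None).sum(axis=0)
        t = (c + np.sqrt(c**2 + 4.0 * P * N)) / (2.0 * P)
        X = np.where(X > 0, X * t[None, :], X / t[None, :])
        if (np.abs(X.sum(axis=1) - r).max() < tol
                and np.abs(X.sum(axis=0) - c).max() < tol):
            break
    return X
```

When A is entrywise positive, N vanishes and the update reduces to the usual marginal ratio, consistent with the abstract's statement that the algorithm reduces to classical Sinkhorn in that case.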
Related papers
- Accelerating Sinkhorn Algorithm with Sparse Newton Iterations [14.094908995798757]
We present Sinkhorn-Newton-Sparse (SNS), an extension to the Sinkhorn algorithm.
SNS converges orders of magnitude faster across a wide range of practical cases.
arXiv Detail & Related papers (2024-01-20T21:23:09Z)
- Sinkhorn Flow: A Continuous-Time Framework for Understanding and Generalizing the Sinkhorn Algorithm [49.45427072226592]
We introduce a continuous-time analogue of the Sinkhorn algorithm.
This perspective allows us to derive novel variants of Sinkhorn schemes that are robust to noise and bias.
arXiv Detail & Related papers (2023-11-28T11:29:12Z)
- Compressed online Sinkhorn [3.2534959204741085]
We revisit the recently introduced online Sinkhorn algorithm of [Mensch and Peyré, 2020].
First, we improve the convergence analysis for the online Sinkhorn algorithm; the new rate that we obtain is faster than the previous rate under certain parameter choices.
Secondly, we propose the compressed online Sinkhorn algorithm which combines measure compression techniques with the online Sinkhorn algorithm.
arXiv Detail & Related papers (2023-10-08T05:33:32Z)
- On Sinkhorn's Algorithm and Choice Modeling [6.43826005042477]
We show that the associated maximum likelihood estimation problems are equivalent to a classic matrix balancing problem with target row and column sums.
This perspective opens doors between two seemingly unrelated research areas.
We draw inspiration from these connections and resolve important open problems in the study of Sinkhorn's algorithm.
arXiv Detail & Related papers (2023-09-30T05:20:23Z)
- Non-asymptotic convergence bounds for Sinkhorn iterates and their gradients: a coupling approach [10.568851068989972]
We focus on a relaxation of the original OT problem, the entropic OT problem, which makes it possible to implement efficient and practical algorithmic solutions.
This formulation, also known as the Schrödinger Bridge problem, notably connects with Stochastic Optimal Control (SOC) and can be solved with the popular Sinkhorn algorithm (the standard entropic formulation is recalled after this list).
arXiv Detail & Related papers (2023-04-13T13:58:25Z)
- Optimal Algorithms for the Inhomogeneous Spiked Wigner Model [89.1371983413931]
We derive an approximate message-passing algorithm (AMP) for the inhomogeneous problem.
We identify in particular the existence of a statistical-to-computational gap where known algorithms require a signal-to-noise ratio bigger than the information-theoretic threshold to perform better than random.
arXiv Detail & Related papers (2023-02-13T19:57:17Z)
- Rethinking Initialization of the Sinkhorn Algorithm [36.72281550968301]
We show that data-dependent initializers result in dramatic speed-ups, with no effect on differentiability as long as implicit differentiation is used.
Our initializers rely on closed forms for exact or approximate OT solutions known in the 1D, Gaussian, or GMM settings.
arXiv Detail & Related papers (2022-06-15T16:23:03Z)
- Bregman divergence based em algorithm and its application to classical and quantum rate distortion theory [61.12008553173672]
We address the minimization problem of the Bregman divergence between an exponential subfamily and a mixture subfamily in a Bregman divergence system.
We apply this algorithm to rate distortion and its variants including the quantum setting.
arXiv Detail & Related papers (2022-01-07T13:33:28Z)
- Direct Measure Matching for Crowd Counting [59.66286603624411]
We propose a new measure-based counting approach that directly regresses the predicted density maps onto the scattered point-annotated ground truth.
In this paper, we derive a semi-balanced form of Sinkhorn divergence, based on which a Sinkhorn counting loss is designed for measure matching.
arXiv Detail & Related papers (2021-07-04T06:37:33Z)
- Spectral clustering under degree heterogeneity: a case for the random walk Laplacian [83.79286663107845]
This paper shows that graph spectral embedding using the random walk Laplacian produces vector representations which are completely corrected for node degree.
In the special case of a degree-corrected block model, the embedding concentrates about K distinct points, representing communities.
arXiv Detail & Related papers (2021-05-03T16:36:27Z)
- Learning the Positions in CountSketch [51.15935547615698]
We consider sketching algorithms which first compress data by multiplication with a random sketch matrix, and then apply the sketch to quickly solve an optimization problem.
In this work, we propose the first learning algorithm that also optimizes the locations of the non-zero entries.
We show this algorithm gives better accuracy for low-rank approximation than previous work, and apply it to other problems such as $k$-means clustering for the first time (a toy CountSketch example is given after this list).
arXiv Detail & Related papers (2020-07-20T05:06:29Z)
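For the "Non-asymptotic convergence bounds" entry above, the entropic OT problem it refers to has a standard formulation, recalled here for convenience (textbook form, not quoted from that paper's text):

```latex
% Entropic optimal transport between marginals p and q, with cost C and
% regularization parameter \varepsilon > 0 (standard formulation):
\min_{\pi \in \Pi(p,q)} \; \langle C, \pi \rangle
  \;+\; \varepsilon \sum_{ij} \pi_{ij}\bigl(\log \pi_{ij} - 1\bigr),
\qquad
\Pi(p,q) = \{\pi \ge 0 : \pi \mathbf{1} = p,\; \pi^{\top}\mathbf{1} = q\}.
% Its solution has the form \pi = \mathrm{diag}(u)\, K \,\mathrm{diag}(v)
% with K = e^{-C/\varepsilon}; Sinkhorn's algorithm alternately updates
% u = p / (K v) and v = q / (K^{\top} u).
```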
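For the "Learning the Positions in CountSketch" entry above, the following toy sketch shows only the classical random CountSketch construction (one random signed nonzero per column of the sketch matrix) that the paper's learned variant builds on; the function name and parameters are hypothetical, and the paper's learning algorithm is not reproduced.

```python
# Minimal classical CountSketch (random positions and signs), for illustration
# only; the cited paper *learns* the positions, which is not reproduced here.
import numpy as np

def countsketch(A, m, rng=None):
    """Compress the rows of A (n x d) down to m rows, computing S @ A where
    S (m x n) has exactly one nonzero (+1 or -1) per column, at a random row."""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    rows = rng.integers(0, m, size=n)          # sketch row each data row maps to
    signs = rng.choice([-1.0, 1.0], size=n)    # random sign per data row
    SA = np.zeros((m, A.shape[1]))
    # Accumulate signed data rows into their sketch buckets (equivalent to S @ A).
    np.add.at(SA, rows, signs[:, None] * A)
    return SA

# Example: sketch a tall matrix before solving, e.g., least squares on the sketch.
A = np.random.default_rng(0).normal(size=(10000, 50))
SA = countsketch(A, m=500)
```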