A polynomial time iterative algorithm for matching Gaussian matrices
with non-vanishing correlation
- URL: http://arxiv.org/abs/2212.13677v1
- Date: Wed, 28 Dec 2022 03:30:47 GMT
- Title: A polynomial time iterative algorithm for matching Gaussian matrices
with non-vanishing correlation
- Authors: Jian Ding, Zhangsong Li
- Abstract summary: We propose an iterative matching algorithm, which succeeds in polynomial time as long as the correlation between the two Gaussian matrices does not vanish.
Our result is the first polynomial time algorithm that solves a graph matching type of problem when the correlation is an arbitrarily small constant.
- Score: 7.343886246061388
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Motivated by the problem of matching vertices in two correlated
Erd\H{o}s-R\'enyi graphs, we study the problem of matching two correlated
Gaussian Wigner matrices. We propose an iterative matching algorithm, which
succeeds in polynomial time as long as the correlation between the two Gaussian
matrices does not vanish. Our result is the first polynomial time algorithm
that solves a graph matching type of problem when the correlation is an
arbitrarily small constant.
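To make the problem concrete, the following is a minimal simulation of the setup described in the abstract: two symmetric Gaussian (Wigner) matrices whose entries are correlated with coefficient rho, one of them observed after a latent relabeling of its indices. The greedy sorted-profile matcher at the end is a deliberately naive baseline for illustration only, not the paper's iterative algorithm, and the size n and correlation rho are arbitrary choices.

```python
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)

def wigner(n, rng):
    """Symmetric matrix with i.i.d. standard Gaussian entries above the diagonal."""
    g = np.triu(rng.standard_normal((n, n)), 1)
    return g + g.T

n, rho = 200, 0.9                                       # illustrative size and correlation
A = wigner(n, rng)
B0 = rho * A + np.sqrt(1 - rho ** 2) * wigner(n, rng)   # entrywise-correlated copy of A
pi = rng.permutation(n)                                 # latent correspondence between indices
B = B0[np.ix_(pi, pi)]                                  # observed matrix: rows/columns relabeled

# Naive baseline (NOT the paper's algorithm): match each row of B to the row of A
# with the closest sorted entry profile, a statistic that ignores the relabeling.
dists = cdist(np.sort(B, axis=1), np.sort(A, axis=1))
guess = dists.argmin(axis=1)                            # proposed counterpart in A of each row of B

# Row i of B is row pi[i] of the unrelabeled copy, so the ground-truth matching is pi.
print("fraction of indices correctly matched:", np.mean(guess == pi))
```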
Related papers
- An Efficient Algorithm for Clustered Multi-Task Compressive Sensing [60.70532293880842]
Clustered multi-task compressive sensing is a hierarchical model that solves multiple compressive sensing tasks.
The existing inference algorithm for this model is computationally expensive and does not scale well in high dimensions.
We propose a new algorithm that substantially accelerates model inference by avoiding the need to explicitly compute the covariance matrices that make existing inference expensive.
arXiv Detail & Related papers (2023-09-30T15:57:14Z) - On Sinkhorn's Algorithm and Choice Modeling [6.43826005042477]
We show that the associated maximum likelihood estimation problems are equivalent to a classic matrix balancing problem with target row and column sums.
This perspective opens doors between two seemingly unrelated research areas.
We draw inspiration from these connections and resolve important open problems in the study of Sinkhorn's algorithm (a minimal matrix-balancing example appears after this list).
arXiv Detail & Related papers (2023-09-30T05:20:23Z) - A polynomial-time iterative algorithm for random graph matching with
non-vanishing correlation [12.869436542291144]
We propose an efficient algorithm for matching two correlated Erd\H{o}s-R\'enyi graphs with $n$ vertices whose edges are correlated through a latent correspondence.
Our algorithm runs in polynomial time and succeeds in recovering the latent matching as long as the edge correlation is non-vanishing (a toy construction of such a correlated pair is sketched after this list).
arXiv Detail & Related papers (2023-06-01T00:58:50Z) - Efficient Algorithms for Exact Graph Matching on Correlated Stochastic
Block Models with Constant Correlation [7.914348940034351]
We propose an efficient algorithm for matching graphs with community structure.
Our algorithm is the first low-order polynomial-time algorithm achieving exact matching between two correlated stochastic block models.
arXiv Detail & Related papers (2023-05-31T09:06:50Z) - Kernelized multi-graph matching [0.0]
We introduce a novel kernelized multigraph matching technique that handles attributes on both the nodes and the edges of the graphs.
We propose several projectors leading to improved stability of the results.
arXiv Detail & Related papers (2022-10-11T07:22:47Z) - Graph Learning from Multivariate Dependent Time Series via a
Multi-Attribute Formulation [12.843340232167266]
We consider the problem of inferring the conditional independence graph (CIG) of a stationary time series.
In a time series graph, each component of the vector series is represented by a distinct node, and associations between components are represented by edges between the corresponding nodes.
We formulate the problem as one of multi-attribute graph estimation for random vectors where a vector is associated with each node of the graph.
arXiv Detail & Related papers (2022-04-29T00:17:52Z) - High-Dimensional Sparse Bayesian Learning without Covariance Matrices [66.60078365202867]
We introduce a new inference scheme that avoids explicit construction of the covariance matrix.
Our approach couples a little-known diagonal estimation result from numerical linear algebra with the conjugate gradient algorithm.
In several simulations, our method scales better than existing approaches in computation time and memory (a minimal matrix-free sketch of this idea appears after this list).
arXiv Detail & Related papers (2022-02-25T16:35:26Z) - Sublinear Time Approximation of Text Similarity Matrices [50.73398637380375]
We introduce a generalization of the popular Nyström method to the indefinite setting.
Our algorithm can be applied to any similarity matrix and runs in sublinear time in the size of the matrix.
We show that our method, along with a simple variant of CUR decomposition, performs very well in approximating a variety of similarity matrices.
arXiv Detail & Related papers (2021-12-17T17:04:34Z) - Linear Bandit Algorithms with Sublinear Time Complexity [67.21046514005029]
We propose to accelerate existing linear bandit algorithms to achieve per-step time complexity sublinear in the number of arms $K$.
We show that our proposed algorithms can achieve $O(K^{1-\alpha(T)})$ per-step complexity for some $\alpha(T) > 0$ and $\widetilde{O}(\sqrt{T})$ regret, where $T$ is the time horizon.
arXiv Detail & Related papers (2021-03-03T22:42:15Z) - Robustly Learning Mixtures of $k$ Arbitrary Gaussians [47.40835932474677]
We give a polynomial-time algorithm for the problem of robustly estimating a mixture of $k$ arbitrary Gaussians in $\mathbb{R}^d$, for any fixed $k$, in the presence of a constant fraction of arbitrary corruptions.
Our main tools are an efficient partial clustering algorithm that relies on the sum-of-squares method, and a novel tensor decomposition algorithm that allows errors in both Frobenius norm and low-rank terms.
arXiv Detail & Related papers (2020-12-03T17:54:03Z) - Optimal Iterative Sketching with the Subsampled Randomized Hadamard
Transform [64.90148466525754]
We study the performance of iterative sketching for least-squares problems.
We show that the convergence rates for Haar and randomized Hadamard matrices are identical, and asymptotically improve upon random projections.
These techniques may be applied to other algorithms that employ randomized dimension reduction (a minimal iterative-sketching loop for least squares appears after this list).
arXiv Detail & Related papers (2020-02-03T16:17:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.