On the Local Linear Rate of Consensus on the Stiefel Manifold
- URL: http://arxiv.org/abs/2101.09346v1
- Date: Fri, 22 Jan 2021 21:52:38 GMT
- Title: On the Local Linear Rate of Consensus on the Stiefel Manifold
- Authors: Shixiang Chen, Alfredo Garcia, Mingyi Hong, Shahin Shahrampour
- Abstract summary: We study the convergence properties of the Riemannian gradient method for the consensus problem (for an undirected connected graph) over the Stiefel manifold.
The main technical challenges include (i) developing a Riemannian restricted secant inequality for the convergence analysis, and (ii) identifying the conditions (e.g., suitable step-size and initialization) under which the algorithm always stays in the local region.
- Score: 39.750623187256735
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We study the convergence properties of Riemannian gradient method for solving
the consensus problem (for an undirected connected graph) over the Stiefel
manifold. The Stiefel manifold is a non-convex set and the standard notion of
averaging in the Euclidean space does not work for this problem. We propose
Distributed Riemannian Consensus on Stiefel Manifold (DRCS) and prove that it
enjoys a local linear convergence rate to global consensus. More importantly,
this local rate asymptotically scales with the second largest singular value of
the communication matrix, which is on par with the well-known rate in the
Euclidean space. To the best of our knowledge, this is the first work showing
the equality of the two rates. The main technical challenges include (i)
developing a Riemannian restricted secant inequality for convergence analysis,
and (ii) identifying the conditions (e.g., suitable step-size and
initialization) under which the algorithm always stays in the local region.
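To make the abstract's update concrete, below is a minimal numpy sketch of one consensus round in the spirit of DRCS: each agent averages its neighbors' iterates through a doubly stochastic communication matrix W, projects the disagreement onto its tangent space, and retracts back onto the Stiefel manifold. The function names and the choice of polar retraction are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def tangent_proj(X, Z):
    """Project Z onto the tangent space of the Stiefel manifold at X
    (embedded Euclidean metric)."""
    XtZ = X.T @ Z
    return Z - X @ (XtZ + XtZ.T) / 2

def polar_retract(X, Xi):
    """Polar retraction: map X + Xi back onto the Stiefel manifold."""
    U, _, Vt = np.linalg.svd(X + Xi, full_matrices=False)
    return U @ Vt

def consensus_step(Xs, W, alpha):
    """One DRCS-style round: average neighbors through W, project the
    disagreement onto the tangent space, and retract."""
    out = []
    for i, Xi in enumerate(Xs):
        avg = sum(W[i, j] * Xs[j] for j in range(len(Xs)))
        grad = tangent_proj(Xi, Xi - avg)   # Riemannian consensus gradient
        out.append(polar_retract(Xi, -alpha * grad))
    return out
```

For a doubly stochastic W, `np.linalg.svd(W, compute_uv=False)[1]` gives the second largest singular value which, per the abstract, governs the local linear rate.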
Related papers
- Extragradient Type Methods for Riemannian Variational Inequality Problems [25.574847201669144]
We show that the average-rate convergence of both REG and RPEG is $O\left(\frac{1}{T}\right)$, aligning with the rate in the Euclidean case.
These results are enabled by judiciously addressing the holonomy effect so that additional assumptions can be reduced.
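For context, the Euclidean extragradient template that these methods generalize is sketched below; REG/RPEG replace the straight-line steps with exponential maps or retractions on the manifold. The operator F and its signature are assumptions, and this is not the paper's code.

```python
import numpy as np

def extragradient(F, z0, eta=0.1, T=1000):
    """Classical (Euclidean) extragradient for a variational inequality
    with operator F: find z* with <F(z*), z - z*> >= 0 for all feasible z."""
    z = z0.astype(float).copy()
    avg = np.zeros_like(z)
    for t in range(1, T + 1):
        z_half = z - eta * F(z)     # extrapolation (look-ahead) step
        z = z - eta * F(z_half)     # update using the look-ahead operator
        avg += (z - avg) / t        # running average; the O(1/T) rate is for this
    return avg
```

For instance, with the monotone rotation operator `F = lambda z: np.array([z[1], -z[0]])`, the averaged iterate converges toward the origin.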
arXiv Detail & Related papers (2023-09-25T14:08:02Z) - Decentralized Riemannian Conjugate Gradient Method on the Stiefel
Manifold [59.73080197971106]
This paper presents a first-order conjugate gradient method that converges faster than the steepest descent method.
It aims to achieve global convergence over the Stiefel manifold.
arXiv Detail & Related papers (2023-08-21T08:02:16Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Can Decentralized Stochastic Minimax Optimization Algorithms Converge
Linearly for Finite-Sum Nonconvex-Nonconcave Problems? [56.62372517641597]
Decentralized minimax optimization has been actively studied in the past few years due to its applications in a wide range of machine learning problems.
This paper develops two novel decentralized minimax optimization algorithms for the finite-sum nonconvex-nonconcave problem.
arXiv Detail & Related papers (2023-04-24T02:19:39Z) - Stochastic Gradient Descent-Ascent and Consensus Optimization for Smooth
Games: Convergence Analysis under Expected Co-coercivity [49.66890309455787]
We introduce the expected co-coercivity condition, explain its benefits, and provide the first last-iterate convergence guarantees of SGDA and SCO.
We prove linear convergence of both methods to a neighborhood of the solution when they use a constant step-size.
Our convergence guarantees hold under the arbitrary sampling paradigm, and we give insights into the complexity of minibatching.
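A minimal sketch of the SGDA update with a constant step-size follows (the SCO variant and the expected co-coercivity condition are developed in the paper; the gradient-oracle signatures and component count here are assumptions):

```python
import numpy as np

def sgda(grad_x, grad_y, x, y, step=0.01, iters=10_000, seed=0):
    """Stochastic gradient descent-ascent for min_x max_y f(x, y).
    grad_x/grad_y are assumed unbiased stochastic gradient oracles
    taking a sampled component index. Sketch, not the paper's code."""
    rng = np.random.default_rng(seed)
    for _ in range(iters):
        i = rng.integers(0, 10)        # assumed: 10 component functions
        gx = grad_x(x, y, i)           # simultaneous gradient evaluation
        gy = grad_y(x, y, i)
        x, y = x - step * gx, y + step * gy   # descent on x, ascent on y
    return x, y
```

With a constant step-size, the iterates settle in a neighborhood of the solution rather than converging exactly, matching the guarantee stated above.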
arXiv Detail & Related papers (2021-06-30T18:32:46Z) - Averaging on the Bures-Wasserstein manifold: dimension-free convergence
of gradient descent [15.136397170510834]
We prove new geodesic convexity results which provide stronger control of the iterates, yielding a dimension-free convergence rate.
Our techniques also enable the analysis of two related notions of averaging, the entropically-regularized barycenter and the geometric median.
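For concreteness, here is a numpy sketch of barycenter averaging for centered Gaussians via the classical fixed-point iteration, which the literature identifies with Bures-Wasserstein gradient descent at unit step-size. A sketch under that assumption, not the paper's code:

```python
import numpy as np

def sqrtm_psd(A):
    """Principal square root of a symmetric PSD matrix."""
    w, V = np.linalg.eigh(A)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

def bw_barycenter(covs, iters=50):
    """Fixed-point iteration for the barycenter of centered Gaussians
    N(0, Sigma_i) under the Bures-Wasserstein metric."""
    S = np.mean(covs, axis=0)                 # start at the Euclidean mean
    for _ in range(iters):
        R = sqrtm_psd(S)
        R_inv = np.linalg.inv(R)
        M = np.mean([sqrtm_psd(R @ C @ R) for C in covs], axis=0)
        S = R_inv @ M @ M @ R_inv             # S_{k+1} = S^{-1/2} M^2 S^{-1/2}
    return S
```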
arXiv Detail & Related papers (2021-06-16T01:05:19Z) - Decentralized Riemannian Gradient Descent on the Stiefel Manifold [39.750623187256735]
We consider a distributed non-convex optimization problem in which a network of agents aims to minimize a global function over the Stiefel manifold.
To achieve exact convergence with a constant step-size, we also propose a decentralized Riemannian gradient tracking algorithm (DRGTA) for the Stiefel manifold.
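The exact-convergence mechanism is gradient tracking; below is a sketch of the Euclidean template (DIGing-style) that, to our understanding, the paper adapts to the Stiefel manifold with retractions. Names and signatures are illustrative:

```python
import numpy as np

def gradient_tracking(grads, W, x, alpha=0.05, iters=500):
    """Euclidean gradient-tracking template. x has shape (num_agents, dim);
    grads[i] is agent i's local gradient; W is doubly stochastic."""
    g = np.stack([grads[i](x[i]) for i in range(len(grads))])
    y = g.copy()                        # trackers start at local gradients
    for _ in range(iters):
        x_new = W @ x - alpha * y       # consensus step along tracked direction
        g_new = np.stack([grads[i](x_new[i]) for i in range(len(grads))])
        y = W @ y + g_new - g           # track the network-average gradient
        x, g = x_new, g_new
    return x
```

Because y tracks the average gradient, a constant step-size yields exact convergence instead of convergence to a neighborhood.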
arXiv Detail & Related papers (2021-02-14T07:30:23Z) - The Convergence Indicator: Improved and completely characterized
parameter bounds for actual convergence of Particle Swarm Optimization [68.8204255655161]
We introduce a new convergence indicator that can be used to determine whether the particles will eventually converge to a single point or diverge.
Using this convergence indicator we provide the actual bounds completely characterizing parameter regions that lead to a converging swarm.
arXiv Detail & Related papers (2020-06-06T19:08:05Z)
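The paper's indicator is analytical; as a rough numerical stand-in, one can empirically test whether a given (w, c1, c2) parameter triple makes the swarm collapse to a single point. A hypothetical check, not the paper's method:

```python
import numpy as np

def swarm_collapses(w, c1, c2, n=30, dim=2, iters=2000, tol=1e-8, seed=0):
    """Empirically check whether a PSO swarm with inertia w and
    acceleration coefficients c1, c2 collapses to a single point."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n, dim))
    v = np.zeros((n, dim))
    f = lambda p: np.sum(p * p, axis=-1)          # sphere test objective
    pbest, pval = x.copy(), f(x)
    for _ in range(iters):
        g = pbest[np.argmin(pval)]                # global best position
        r1 = rng.random((n, dim)); r2 = rng.random((n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        fx = f(x)
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
    return float(np.ptp(x, axis=0).max()) < tol   # swarm diameter near zero?
```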
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.