Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds
- URL: http://arxiv.org/abs/2306.16617v1
- Date: Thu, 29 Jun 2023 01:20:44 GMT
- Title: Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds
- Authors: Yang Cai, Michael I. Jordan, Tianyi Lin, Argyris Oikonomou,
Emmanouil-Vasileios Vlatakis-Gkaragkounis
- Abstract summary: We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
- Score: 77.4346324549323
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Numerous applications in machine learning and data analytics can be
formulated as equilibrium computation over Riemannian manifolds. Despite the
extensive investigation of their Euclidean counterparts, the performance of
Riemannian gradient-based algorithms remains opaque and poorly understood. We
revisit the original scheme of Riemannian gradient descent (RGD) and analyze it
under a geodesic monotonicity assumption, which includes the well-studied
geodesically convex-concave min-max optimization problem as a special case. Our
main contribution is to show that, despite the phenomenon of distance
distortion, the RGD scheme, with a step size that is agnostic to the manifold's
curvature, achieves a curvature-independent and linear last-iterate convergence
rate in the geodesically strongly monotone setting. To the best of our
knowledge, the possibility of curvature-independent rates and/or last-iterate
convergence in the Riemannian setting has not been considered before.
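To make the scheme concrete, below is a minimal, illustrative sketch (not code from the paper) of the RGD update the abstract refers to, instantiated on the unit sphere, where the exponential map has a closed form. The objective, step size, and iteration count are placeholder assumptions, and the toy bilinear payoff is only meant to show the shape of the update, not an instance of the geodesically strongly monotone setting the paper analyzes.

```python
# Sketch of simultaneous Riemannian gradient descent/ascent (RGD) on the unit
# sphere S^{n-1}. Assumptions: embedded metric, closed-form exponential map,
# and a hypothetical bilinear payoff f(x, y) = x^T A y used only for illustration.
import numpy as np

def project_to_tangent(x, v):
    """Project an ambient vector v onto the tangent space of the sphere at x."""
    return v - np.dot(x, v) * x

def exp_map(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x along tangent v."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def rgd_game(grad_x, grad_y, x, y, eta=0.1, num_iters=1000):
    """Simultaneous RGD for min_x max_y f(x, y); eta is fixed without using curvature."""
    for _ in range(num_iters):
        gx = project_to_tangent(x, grad_x(x, y))  # Riemannian gradient w.r.t. x
        gy = project_to_tangent(y, grad_y(x, y))  # Riemannian gradient w.r.t. y
        x = exp_map(x, -eta * gx)                 # descent step for the min player
        y = exp_map(y, +eta * gy)                 # ascent step for the max player
    return x, y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 5
    A = rng.standard_normal((n, n))
    grad_x = lambda x, y: A @ y       # Euclidean gradient of x^T A y w.r.t. x
    grad_y = lambda x, y: A.T @ x     # Euclidean gradient of x^T A y w.r.t. y
    x0 = rng.standard_normal(n); x0 /= np.linalg.norm(x0)
    y0 = rng.standard_normal(n); y0 /= np.linalg.norm(y0)
    x_last, y_last = rgd_game(grad_x, grad_y, x0, y0)
```

The point of the sketch is the design choice highlighted in the abstract: the step size `eta` is fixed without any reference to the manifold's curvature, which is the curvature-agnostic regime in which the paper proves a linear last-iterate rate.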
Related papers
- Riemannian Federated Learning via Averaging Gradient Stream [8.75592575216789]
This paper develops and analyzes an efficient Riemannian Federated Averaging Gradient Stream (RFedAGS) algorithm.
Numerical simulations conducted on synthetic and real-world data demonstrate the performance of the proposed RFedAGS.
arXiv Detail & Related papers (2024-09-11T12:28:42Z) - Convergence and Complexity Guarantee for Inexact First-order Riemannian Optimization Algorithms [18.425648833592312]
We show that tBMM converges to an $\epsilon$-stationary point within $O(\epsilon^{-2})$ iterations.
Under mild assumptions, the results still hold when the subproblem is solved inexactly in each iteration, provided the total optimality gap is bounded.
arXiv Detail & Related papers (2024-05-05T22:53:14Z) - Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to estimate lies in a smooth manifold.
It induces a geometry for the parameter manifold, as well as an intrinsic notion of the estimation error measure.
arXiv Detail & Related papers (2023-11-08T15:17:13Z) - Riemannian stochastic optimization methods avoid strict saddle points [68.80251170757647]
We show that policies under study avoid strict saddle points / submanifolds with probability 1.
This result provides an important sanity check as it shows that, almost always, the limit state of an algorithm can only be a local minimizer.
arXiv Detail & Related papers (2023-11-04T11:12:24Z) - First-Order Algorithms for Min-Max Optimization in Geodesic Metric
Spaces [93.35384756718868]
Min-max algorithms have been analyzed in the Euclidean setting.
We prove that the Riemannian corrected extragradient (RCEG) method achieves last-iterate convergence at a linear rate.
arXiv Detail & Related papers (2022-06-04T18:53:44Z) - Differentially private Riemannian optimization [40.23168342389821]
We introduce a framework of differentially private Riemannian optimization for the empirical risk problem where the parameter is constrained to a Riemannian manifold.
We show the efficacy of the proposed framework in several applications.
arXiv Detail & Related papers (2022-05-19T12:04:15Z) - Nonconvex Stochastic Scaled-Gradient Descent and Generalized Eigenvector
Problems [98.34292831923335]
Motivated by the problem of online correlation analysis, we propose the Stochastic Scaled-Gradient Descent (SSD) algorithm.
We bring these ideas together in an application to online correlation analysis, deriving for the first time an optimal one-time-scale algorithm with an explicit rate of local convergence to normality.
arXiv Detail & Related papers (2021-12-29T18:46:52Z) - Stochastic Zeroth-order Riemannian Derivative Estimation and
Optimization [15.78743548731191]
We propose an oracle version of the Gaussian smoothing function to overcome the difficulty of the non-linearity of the manifold.
We demonstrate the applicability of our algorithms through simulation results and real-world applications, including black-box stiffness control for robotics and black-box attacks on neural networks.
arXiv Detail & Related papers (2020-03-25T06:58:19Z) - From Nesterov's Estimate Sequence to Riemannian Acceleration [52.99237600875452]
We develop an alternative analysis for Nesterov's estimate sequence technique that may also be of independent interest.
Then, we extend this analysis to the Riemannian setting, localizing the key difficulty due to non-Euclidean structure into a certain "metric distortion".
arXiv Detail & Related papers (2020-01-24T04:17:14Z)