From Nesterov's Estimate Sequence to Riemannian Acceleration
- URL: http://arxiv.org/abs/2001.08876v1
- Date: Fri, 24 Jan 2020 04:17:14 GMT
- Title: From Nesterov's Estimate Sequence to Riemannian Acceleration
- Authors: Kwangjun Ahn and Suvrit Sra
- Abstract summary: We develop an alternative analysis for Nesterov's estimate sequence technique that may also be of independent interest.
Then, we extend this analysis to the Riemannian setting, localizing the key difficulty due to non-Euclidean structure into a certain ``metric distortion.''
- Score: 52.99237600875452
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose the first global accelerated gradient method for Riemannian
manifolds. Toward establishing our result we revisit Nesterov's estimate
sequence technique and develop an alternative analysis for it that may also be
of independent interest. Then, we extend this analysis to the Riemannian
setting, localizing the key difficulty due to non-Euclidean structure into a
certain ``metric distortion.'' We control this distortion by developing a novel
geometric inequality, which permits us to propose and analyze a Riemannian
counterpart to Nesterov's accelerated gradient method.
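For orientation, here is a minimal sketch of Nesterov's accelerated gradient method in the Euclidean setting, i.e. the algorithm whose Riemannian counterpart the paper constructs; the constant-momentum variant and the quadratic test objective below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nesterov_agd(grad, x0, L, mu, num_iters=100):
    """Constant-momentum Nesterov acceleration for an L-smooth,
    mu-strongly convex objective (Euclidean setting)."""
    x = x0.copy()
    y = x0.copy()
    beta = (np.sqrt(L / mu) - 1) / (np.sqrt(L / mu) + 1)   # momentum coefficient
    for _ in range(num_iters):
        x_next = y - (1.0 / L) * grad(y)   # gradient step at the extrapolated point
        y = x_next + beta * (x_next - x)   # momentum / extrapolation step
        x = x_next
    return x

# Illustrative use on the quadratic f(x) = 0.5 * x^T A x with A = diag(1, 10, 100).
A = np.diag([1.0, 10.0, 100.0])
x_min = nesterov_agd(lambda x: A @ x, x0=np.ones(3), L=100.0, mu=1.0)
```

Roughly speaking, a Riemannian counterpart replaces these linear gradient and extrapolation steps with updates along geodesics (exponential maps), and the paper's metric-distortion inequality is what controls the resulting deviation from the Euclidean analysis.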
Related papers
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Riemannian accelerated gradient methods via extrapolation [40.23168342389821]
We show that when the iterates are generated by the gradient descent method, the accelerated scheme achieves the optimal convergence rate asymptotically.
Our experiments verify the practical benefit of the novel acceleration strategy.
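As a loose illustration of acceleration via extrapolation, the sketch below applies a regularized nonlinear (Anderson-type) extrapolation to plain Euclidean gradient-descent iterates; this is a generic scheme under these assumptions, not the paper's Riemannian construction.

```python
import numpy as np

def gd_iterates(grad, x0, step, m):
    """Plain gradient-descent iterates x_0, ..., x_m (Euclidean stand-in for RGD)."""
    xs = [x0]
    for _ in range(m):
        xs.append(xs[-1] - step * grad(xs[-1]))
    return np.array(xs)

def extrapolate(xs, reg=1e-10):
    """Anderson/RNA-style extrapolation: combine the iterates with weights that
    minimize the norm of the combined successive differences."""
    U = np.diff(xs, axis=0)                     # differences x_{i+1} - x_i
    M = U @ U.T + reg * np.eye(len(U))          # regularized Gram matrix
    c = np.linalg.solve(M, np.ones(len(U)))
    c = c / c.sum()                             # weights summing to one
    return c @ xs[1:]                           # extrapolated point

A = np.diag([1.0, 10.0, 100.0])
xs = gd_iterates(lambda x: A @ x, x0=np.ones(3), step=0.01, m=10)
x_acc = extrapolate(xs)                         # typically closer to the minimizer than xs[-1]
```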
arXiv Detail & Related papers (2022-08-13T10:31:09Z) - The Dynamics of Riemannian Robbins-Monro Algorithms [101.29301565229265]
We propose a family of Riemannian algorithms generalizing and extending the seminal approximation framework of Robbins and Monro.
Compared to their Euclidean counterparts, Riemannian algorithms are much less understood due to the lack of a global linear structure on the manifold.
We provide a general template of almost sure convergence results that mirrors and extends the existing theory for Euclidean Robbins-Monro schemes.
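A minimal sketch of a Robbins-Monro-type stochastic approximation scheme on a manifold, assuming the unit sphere with a normalization retraction and a leading-eigenvector objective; this is an illustration only, not the paper's general template.

```python
import numpy as np

def riemannian_robbins_monro(noisy_egrad, x0, num_iters=5000, a=1.0, b=10.0):
    """Robbins-Monro-type stochastic approximation on the unit sphere: project a
    noisy Euclidean gradient onto the tangent space, take a diminishing step,
    and retract back to the manifold by normalization."""
    x = x0 / np.linalg.norm(x0)
    for k in range(num_iters):
        g = noisy_egrad(x)
        rgrad = g - np.dot(g, x) * x       # tangent-space projection at x
        x = x - (a / (k + b)) * rgrad      # Robbins-Monro step sizes a / (k + b)
        x = x / np.linalg.norm(x)          # retraction onto the sphere
    return x

# Illustrative use: leading eigenvector of C, minimizing f(x) = -0.5 x^T C x on
# the sphere with stochastic gradients built from samples z ~ N(0, C).
rng = np.random.default_rng(0)
C = np.diag([3.0, 1.0, 0.5])
def noisy_egrad(x):
    z = rng.multivariate_normal(np.zeros(3), C)
    return -np.dot(z, x) * z               # unbiased estimate of -C x
x_top = riemannian_robbins_monro(noisy_egrad, x0=np.ones(3))   # ~ +/- e_1
```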
arXiv Detail & Related papers (2022-06-14T12:30:11Z) - First-Order Algorithms for Min-Max Optimization in Geodesic Metric
Spaces [93.35384756718868]
Many min-max algorithms have been analyzed in the Euclidean setting.
We prove that the Riemannian corrected extragradient (RCEG) method achieves last-iterate convergence at a linear rate.
arXiv Detail & Related papers (2022-06-04T18:53:44Z) - Differentially private Riemannian optimization [40.23168342389821]
We introduce a framework for differentially private Riemannian optimization for the empirical risk minimization problem where the parameter is constrained to a Riemannian manifold.
We show the efficacy of the proposed framework in several applications.
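A gradient-perturbation sketch of differentially private optimization on a manifold, assuming the unit sphere, per-example clipping, and Gaussian noise added in the tangent space; the noise scale sigma is a placeholder, and calibrating it to a target (epsilon, delta) requires privacy accounting not shown here. This illustrates the general mechanism rather than the paper's specific framework.

```python
import numpy as np

def dp_riemannian_sgd(per_example_egrads, x0, sigma, clip=1.0,
                      step=0.1, num_iters=200, rng=None):
    """Gradient-perturbation sketch on the unit sphere: clip per-example
    Euclidean gradients, average, add Gaussian noise, project onto the tangent
    space, and retract. Choosing sigma for a target (epsilon, delta) requires a
    privacy accountant that is not included here."""
    rng = rng or np.random.default_rng(0)
    x = x0 / np.linalg.norm(x0)
    for _ in range(num_iters):
        grads = per_example_egrads(x)                          # shape (n, d)
        scale = np.maximum(np.linalg.norm(grads, axis=1) / clip, 1.0)
        g = (grads / scale[:, None]).mean(axis=0)              # clipped average
        g = g + rng.normal(0.0, sigma * clip / len(grads), size=g.shape)
        rgrad = g - np.dot(g, x) * x                           # tangent projection
        x = x - step * rgrad
        x = x / np.linalg.norm(x)                              # retraction
    return x

# Illustrative use: private leading direction of the data Z (rows = examples).
rng = np.random.default_rng(1)
Z = rng.normal(size=(500, 3)) @ np.diag([2.0, 1.0, 0.5])
x_priv = dp_riemannian_sgd(lambda x: -(Z @ x)[:, None] * Z, x0=np.ones(3), sigma=1.0)
```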
arXiv Detail & Related papers (2022-05-19T12:04:15Z) - A Riemannian Accelerated Proximal Extragradient Framework and its
Implications [52.31775617527208]
We revisit the Accelerated Hybrid Proximal Extragradient (A-HPE) method of Monteiro and Svaiter (2013), a powerful framework for obtaining accelerated Euclidean methods.
arXiv Detail & Related papers (2021-11-04T11:32:20Z) - A diffusion-map-based algorithm for gradient computation on manifolds
and applications [0.0]
We recover the gradient of a given function defined on interior points of a Riemannian submanifold of Euclidean space.
This approach is based on the estimates of the Laplace-Beltrami operator proposed in the diffusion-maps theory.
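As a rough illustration of estimating gradients from samples on a submanifold, the sketch below fits a kernel-weighted first-order expansion using Gaussian (diffusion-map-style) weights; this is a generic estimator under these assumptions, not the paper's Laplace-Beltrami-based construction.

```python
import numpy as np

def kernel_gradient_estimate(X, f_vals, x, f_x, eps=0.1):
    """Estimate the (tangential) gradient of f at x from samples X lying on a
    submanifold of Euclidean space: a weighted least-squares fit of the
    first-order expansion f(x_i) ~ f(x) + <g, x_i - x>, localized with Gaussian
    weights exp(-||x_i - x||^2 / eps) as used in diffusion maps."""
    D = X - x                                         # displacements x_i - x
    w = np.exp(-np.sum(D ** 2, axis=1) / eps)         # kernel weights
    sqrt_w = np.sqrt(w)
    g, *_ = np.linalg.lstsq(sqrt_w[:, None] * D, sqrt_w * (f_vals - f_x), rcond=None)
    return g

# Illustrative use: f(p) = p[0] restricted to the unit circle.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 500)
X = np.stack([np.cos(theta), np.sin(theta)], axis=1)
x = np.array([np.cos(0.3), np.sin(0.3)])
g_hat = kernel_gradient_estimate(X, X[:, 0], x, f_x=x[0])   # approximates the tangential gradient
```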
arXiv Detail & Related papers (2021-08-16T09:35:22Z) - On the Convergence of Stochastic Extragradient for Bilinear Games with
Restarted Iteration Averaging [96.13485146617322]
We present an analysis of the Stochastic ExtraGradient (SEG) method with constant step size, and present variations of the method that yield favorable convergence.
We prove that when augmented with averaging, SEG provably converges to the Nash equilibrium, and such a rate is provably accelerated by incorporating a scheduled restarting procedure.
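A minimal sketch of stochastic extragradient (SEG) with constant step size and iteration averaging on a bilinear game; the Gaussian gradient noise and the omission of the scheduled restarting procedure are simplifying assumptions.

```python
import numpy as np

def seg_averaged(A, x0, y0, step=0.1, noise=0.1, num_iters=2000, rng=None):
    """Stochastic extragradient (SEG) with constant step size and iteration
    averaging for the bilinear game min_x max_y x^T A y; Gaussian perturbations
    of the gradients stand in for stochasticity."""
    rng = rng or np.random.default_rng(0)
    x, y = x0.copy(), y0.copy()
    x_avg, y_avg = np.zeros_like(x), np.zeros_like(y)
    for k in range(num_iters):
        gx = A @ y + noise * rng.standard_normal(x.shape)      # grad in x at (x, y)
        gy = A.T @ x + noise * rng.standard_normal(y.shape)    # grad in y at (x, y)
        x_half, y_half = x - step * gx, y + step * gy          # extrapolation step
        gx = A @ y_half + noise * rng.standard_normal(x.shape)
        gy = A.T @ x_half + noise * rng.standard_normal(y.shape)
        x, y = x - step * gx, y + step * gy                    # update from (x, y)
        x_avg += (x - x_avg) / (k + 1)                         # running iterate averages
        y_avg += (y - y_avg) / (k + 1)
    return x_avg, y_avg

A = np.eye(2)
x_bar, y_bar = seg_averaged(A, x0=np.ones(2), y0=np.ones(2))   # both ~ 0, the equilibrium
```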
arXiv Detail & Related papers (2021-06-30T17:51:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.