Global Riemannian Acceleration in Hyperbolic and Spherical Spaces
- URL: http://arxiv.org/abs/2012.03618v3
- Date: Fri, 29 Jan 2021 12:59:58 GMT
- Title: Global Riemannian Acceleration in Hyperbolic and Spherical Spaces
- Authors: David Martínez-Rubio
- Abstract summary: Research on the acceleration phenomenon on Riemannian manifolds via the first global first-order method.
The method achieves the same rates as accelerated gradient descent in the Euclidean setting.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We further research on the acceleration phenomenon on Riemannian manifolds by
introducing the first global first-order method that achieves the same rates as
accelerated gradient descent in the Euclidean space for the optimization of
smooth and geodesically convex (g-convex) or strongly g-convex functions
defined on the hyperbolic space or a subset of the sphere, up to constants and
log factors. To the best of our knowledge, this is the first method that is
proved to achieve these rates globally on functions defined on a Riemannian
manifold $\mathcal{M}$ other than the Euclidean space. As a proxy, we solve a
constrained non-convex Euclidean problem, under a condition between convexity
and quasar-convexity, of independent interest. Additionally, for any Riemannian
manifold of bounded sectional curvature, we provide reductions from
optimization methods for smooth and g-convex functions to methods for smooth
and strongly g-convex functions and vice versa.
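The paper's accelerated method is not reproduced here, but the setting it works in can be illustrated with a minimal sketch of plain (non-accelerated) Riemannian gradient descent on the unit sphere, using the sphere's exponential map and tangent-space projection. The function `f(x) = -<a, x>` and all names below are illustrative assumptions, not from the paper.

```python
import numpy as np

def exp_map(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x along tangent vector v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def riemannian_gd(x0, euclidean_grad, lr=0.1, steps=100):
    """Plain Riemannian gradient descent on the sphere (the unaccelerated baseline)."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        g = euclidean_grad(x)
        # Project the Euclidean gradient onto the tangent space at x
        # to obtain the Riemannian gradient.
        rg = g - np.dot(g, x) * x
        x = exp_map(x, -lr * rg)
    return x

# Minimize the g-convex function f(x) = -<a, x> on the sphere;
# its minimizer is a / ||a||.
a = np.array([1.0, 2.0, 2.0])
x_star = riemannian_gd(np.array([1.0, 0.0, 0.0]), lambda x: -a)
```

Acceleration, the paper's contribution, replaces the plain gradient step above with momentum-like iterates while controlling the geometric distortion of the hyperbolic or spherical metric.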
Related papers
- Riemannian Bilevel Optimization [35.42472057648458]
We focus in particular on batch and gradient-based methods, with the explicit goal of avoiding second-order information.
We propose and analyze $\mathrm{RF2SA}$, a method that leverages first-order gradient information.
We provide explicit convergence rates for reaching $\epsilon$-stationary points under various setups.
arXiv Detail & Related papers (2024-05-22T20:49:01Z)
- Decentralized Riemannian Conjugate Gradient Method on the Stiefel Manifold [59.73080197971106]
This paper presents a first-order conjugate gradient method that converges faster than the steepest descent method.
It aims to achieve global convergence over the Stiefel manifold.
arXiv Detail & Related papers (2023-08-21T08:02:16Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- Accelerated Riemannian Optimization: Handling Constraints with a Prox to Bound Geometric Penalties [20.711789781518753]
We propose a globally-accelerated, first-order method for the optimization of smooth and geodesically convex functions.
We achieve the same convergence rates as Nesterov's accelerated gradient descent, up to constants depending on the geometry of the compact feasible set.
arXiv Detail & Related papers (2022-11-26T19:28:48Z)
- Optimal Extragradient-Based Bilinearly-Coupled Saddle-Point Optimization [116.89941263390769]
We consider the smooth convex-concave bilinearly-coupled saddle-point problem, $\min_{\mathbf{x}}\max_{\mathbf{y}} F(\mathbf{x}) + H(\mathbf{x},\mathbf{y}) - G(\mathbf{y})$, where one has access to first-order oracles for $F$, $G$ as well as the bilinear coupling function $H$.
We present an accelerated gradient-extragradient (AG-EG) descent-ascent algorithm that combines extragradient steps with acceleration.
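The extragradient building block can be sketched in isolation. Below is a minimal, hypothetical implementation for the purely bilinear case $\min_x \max_y x^\top A y$ (no $F$ or $G$ terms, no acceleration); it is not the AG-EG algorithm itself, only the extrapolate-then-update idea it combines with acceleration.

```python
import numpy as np

def extragradient(A, x0, y0, lr=0.3, steps=300):
    """Extragradient for the bilinear saddle problem min_x max_y x^T A y."""
    x, y = x0.copy(), y0.copy()
    for _ in range(steps):
        # Extrapolation (look-ahead) step at the current point.
        x_h = x - lr * (A @ y)
        y_h = y + lr * (A.T @ x)
        # Actual update uses gradients evaluated at the extrapolated point.
        x = x - lr * (A @ y_h)
        y = y + lr * (A.T @ x_h)
    return x, y

A = np.eye(2)
x, y = extragradient(A, np.array([1.0, 1.0]), np.array([1.0, -1.0]))
# Iterates contract toward the saddle point (0, 0), where plain
# simultaneous descent-ascent with a fixed step would spiral outward.
```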
arXiv Detail & Related papers (2022-06-17T06:10:20Z)
- First-Order Algorithms for Min-Max Optimization in Geodesic Metric Spaces [93.35384756718868]
Min-max algorithms have mostly been analyzed in the Euclidean setting.
We prove that the Riemannian corrected extragradient (RCEG) method achieves last-iterate convergence at a linear rate.
arXiv Detail & Related papers (2022-06-04T18:53:44Z)
- Accelerated Primal-Dual Gradient Method for Smooth and Convex-Concave Saddle-Point Problems with Bilinear Coupling [84.47780064014262]
We study a convex-concave saddle-point problem with bilinear coupling, $\min_x\max_y f(x) + y^\top \mathbf{A} x - g(y)$.
arXiv Detail & Related papers (2021-12-30T20:31:46Z)
- Lifting the Convex Conjugate in Lagrangian Relaxations: A Tractable Approach for Continuous Markov Random Fields [53.31927549039624]
We show that a piecewise discretization preserves contrast better than existing discretization approaches.
We apply this theory to the problem of matching two images.
arXiv Detail & Related papers (2021-07-13T12:31:06Z)
- Stochastic Zeroth-order Riemannian Derivative Estimation and Optimization [15.78743548731191]
We propose an oracle version of the Gaussian smoothing technique to overcome the difficulty posed by the non-linearity of the manifold.
We demonstrate the applicability of our algorithms with numerical results and real-world applications: black-box stiffness control for robotics and black-box attacks on neural networks.
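The paper's estimator respects the Riemannian structure; as a simplified illustration, the Euclidean analogue of Gaussian-smoothing gradient estimation is sketched below. All names and parameter choices are illustrative assumptions, not the paper's.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, samples=5000, rng=None):
    """Zeroth-order gradient estimate of f at x via Gaussian smoothing.

    Averages (f(x + mu*u) - f(x)) / mu * u over Gaussian directions u;
    the expectation is the gradient of a mu-smoothed version of f.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    g = np.zeros_like(x)
    for _ in range(samples):
        u = rng.standard_normal(x.shape)
        g += (f(x + mu * u) - f(x)) / mu * u
    return g / samples

f = lambda x: np.dot(x, x)     # true gradient is 2x
x = np.array([1.0, -2.0])
g_hat = zo_gradient(f, x)
# g_hat approximates [2.0, -4.0] up to Monte Carlo noise.
```

Only function evaluations of `f` are used, which is what makes the approach applicable to black-box objectives such as the stiffness-control and adversarial-attack examples above.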
arXiv Detail & Related papers (2020-03-25T06:58:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.