Curvature-Dependant Global Convergence Rates for Optimization on
Manifolds of Bounded Geometry
- URL: http://arxiv.org/abs/2008.02517v1
- Date: Thu, 6 Aug 2020 08:30:35 GMT
- Title: Curvature-Dependant Global Convergence Rates for Optimization on
Manifolds of Bounded Geometry
- Authors: Mario Lezcano-Casado
- Abstract summary: We give curvature-dependant convergence rates for weakly convex functions defined on a manifold of 1-bounded geometry.
We compute these bounds explicitly for some manifolds commonly used in the optimization literature.
We present self-contained proofs of fully general bounds on the norm of the differential of the exponential map.
- Score: 6.85316573653194
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: We give curvature-dependant convergence rates for the optimization of weakly
convex functions defined on a manifold of 1-bounded geometry via Riemannian
gradient descent and via the dynamic trivialization algorithm. In order to do
this, we give a tighter bound on the norm of the Hessian of the Riemannian
exponential than previously known. We compute these bounds explicitly for
some manifolds commonly used in the optimization literature such as the special
orthogonal group and the real Grassmannian. Along the way, we present
self-contained proofs of fully general bounds on the norm of the differential
of the exponential map and certain cosine inequalities on manifolds, which are
commonly used in optimization on manifolds.
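As a concrete illustration of the first of these algorithms, the following is a minimal sketch of Riemannian gradient descent on the special orthogonal group SO(n), one of the manifolds for which the paper computes its bounds explicitly. The objective and the nearest-rotation example below are illustrative assumptions, not taken from the paper; the update is the standard geodesic step X <- X expm(-eta * skew(X^T grad f(X))) under the bi-invariant metric.

```python
import numpy as np
from scipy.linalg import expm

def skew(A):
    """Projection onto skew-symmetric matrices (the Lie algebra so(n))."""
    return 0.5 * (A - A.T)

def rgd_so_n(egrad, X0, step=0.1, iters=200):
    """Riemannian gradient descent on SO(n) using the exact exponential map.

    egrad(X): Euclidean gradient of the objective at the n x n matrix X.
    The Riemannian gradient at X is X @ skew(X.T @ egrad(X)), and the
    geodesic update X <- X @ expm(-step * skew(X.T @ egrad(X))) keeps
    every iterate exactly on the manifold.
    """
    X = X0.copy()
    for _ in range(iters):
        xi = skew(X.T @ egrad(X))     # tangent direction, expressed at the identity
        X = X @ expm(-step * xi)      # follow the geodesic; X stays in SO(n)
    return X

# Illustrative use (hypothetical data): the nearest-rotation problem
# min_{X in SO(n)} 0.5 * ||X - M||_F^2.
rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))
egrad = lambda X: X - M               # Euclidean gradient of 0.5 * ||X - M||_F^2
X_star = rgd_so_n(egrad, np.eye(n))
print(np.linalg.norm(X_star.T @ X_star - np.eye(n)))   # orthogonality check, near machine precision
```

The dynamic trivialization algorithm analyzed alongside it instead takes plain Euclidean gradient steps on the pullback f(x_k expm(B)) over skew-symmetric B and only updates the base point x_k occasionally; roughly speaking, updating the base point at every step recovers the Riemannian gradient descent sketched above.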
Related papers
- Posterior Contraction Rates for Matérn Gaussian Processes on
Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice than their extrinsic counterparts.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Decentralized Riemannian Conjugate Gradient Method on the Stiefel
Manifold [59.73080197971106]
This paper presents a decentralized first-order Riemannian conjugate gradient method that converges faster than the steepest-descent method.
It aims to achieve global convergence over the Stiefel manifold.
arXiv Detail & Related papers (2023-08-21T08:02:16Z)
- Warped geometric information on the optimisation of Euclidean functions [43.43598316339732]
We consider optimisation of a real-valued function defined in a potentially high-dimensional Euclidean space.
We find the function's optimum along a manifold with a warped metric.
Our proposed algorithm, which uses a third-order approximation of geodesics, tends to outperform standard Euclidean gradient-based counterparts.
arXiv Detail & Related papers (2023-08-16T12:08:50Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Linearization Algorithms for Fully Composite Optimization [61.20539085730636]
This paper studies first-order algorithms for solving fully composite optimization problems over convex compact sets.
We leverage the structure of the objective by handling its differentiable and non-differentiable parts separately, linearizing only the smooth parts.
arXiv Detail & Related papers (2023-02-24T18:41:48Z) - First-Order Algorithms for Min-Max Optimization in Geodesic Metric
Spaces [93.35384756718868]
Min-max algorithms have previously been analyzed mainly in the Euclidean setting.
We prove that the Riemannian corrected extragradient (RCEG) method achieves last-iterate convergence at a linear rate.
arXiv Detail & Related papers (2022-06-04T18:53:44Z) - Generalization in Supervised Learning Through Riemannian Contraction [4.3604518673788135]
In a supervised learning setting, we show that if an optimizer is contracting in some Riemannian metric with rate $\lambda > 0$, it is uniformly algorithmically stable with rate $O(1/(\lambda n))$, where $n$ is the number of labelled examples.
The results hold for stochastic and deterministic gradient descent, in both continuous and discrete time.
They can be shown to be optimal in certain linear settings, such as gradient descent over convex or strongly convex loss surfaces.
arXiv Detail & Related papers (2022-01-17T23:08:47Z) - Global Riemannian Acceleration in Hyperbolic and Spherical Spaces [0.0]
This work furthers research on the acceleration phenomenon on Riemannian manifolds of constant curvature via the first global first-order method for such spaces.
The method achieves the same rates as accelerated gradient descent in the Euclidean setting.
arXiv Detail & Related papers (2020-12-07T12:09:30Z) - Random extrapolation for primal-dual coordinate descent [61.55967255151027]
We introduce a randomly extrapolated primal-dual coordinate descent method that adapts to the sparsity of the data matrix and to favorable structures of the objective function.
We show almost sure convergence of the sequence and optimal sublinear convergence rates for the primal-dual gap and objective values, in the general convex-concave case.
arXiv Detail & Related papers (2020-07-13T17:39:35Z)
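For context on the entry above, here is a minimal, hedged sketch of the deterministic primal-dual extrapolation step that randomly extrapolated variants build on: a plain primal-dual hybrid gradient (Chambolle-Pock) iteration for min_x f(x) + g(Kx), written for a small lasso-type instance. The problem data, step-size choice, and function names are illustrative assumptions, not taken from that paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1, applied elementwise."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pdhg_lasso(K, b, lam, iters=2000):
    """Primal-dual hybrid gradient for min_x lam*||x||_1 + 0.5*||K x - b||^2,
    split as f(x) = lam*||x||_1 and g(z) = 0.5*||z - b||^2 composed with K.

    The dual step uses the deterministic extrapolation 2*x_new - x; the paper
    summarized above instead draws a random extrapolation over coordinates.
    """
    m, n = K.shape
    L = np.linalg.norm(K, 2)              # spectral norm of K
    tau = sigma = 0.9 / L                 # step sizes satisfying tau*sigma*L^2 < 1
    x, y = np.zeros(n), np.zeros(m)
    for _ in range(iters):
        x_new = soft_threshold(x - tau * (K.T @ y), tau * lam)   # prox of tau*f
        x_bar = 2.0 * x_new - x                                  # extrapolated primal point
        y = (y + sigma * (K @ x_bar - b)) / (1.0 + sigma)        # prox of sigma*g* in closed form
        x = x_new
    return x

# Illustrative run on random data (not from the paper).
rng = np.random.default_rng(0)
K = rng.standard_normal((50, 200))
x_true = np.zeros(200); x_true[:5] = 1.0
b = K @ x_true + 0.01 * rng.standard_normal(50)
x_hat = pdhg_lasso(K, b, lam=0.1 * np.max(np.abs(K.T @ b)))
print(np.count_nonzero(np.abs(x_hat) > 1e-3))   # size of the recovered support
```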