A Riemannian smoothing steepest descent method for non-Lipschitz
optimization on submanifolds
- URL: http://arxiv.org/abs/2104.04199v1
- Date: Fri, 9 Apr 2021 05:38:28 GMT
- Title: A Riemannian smoothing steepest descent method for non-Lipschitz
optimization on submanifolds
- Authors: Chao Zhang, Xiaojun Chen, Shiqian Ma
- Abstract summary: A smoothing steepest descent method is proposed to minimize a non-Lipschitz function on submanifolds.
We prove that any accumulation point of the generated sequence is a stationary point of the smoothing function and, under a Riemannian gradient sub-consistency condition, a limiting stationary point of the original non-Lipschitz problem.
Numerical experiments are conducted to demonstrate the efficiency of the proposed method.
- Score: 15.994495285950293
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose a Riemannian smoothing steepest descent method to
minimize a nonconvex and non-Lipschitz function on submanifolds. The
generalized subdifferentials on Riemannian manifolds and the Riemannian gradient
sub-consistency are defined and discussed. We prove that any accumulation point
of the sequence generated by the Riemannian smoothing steepest descent method
is a stationary point associated with the smoothing function employed in the
method, which is necessary for the local optimality of the original
non-Lipschitz problem. Under the Riemannian gradient sub-consistency condition,
we also prove that any accumulation point is a Riemannian limiting stationary
point of the original non-Lipschitz problem. Numerical experiments are
conducted to demonstrate the efficiency of the proposed method.
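To make the smoothing idea concrete, here is a minimal Python sketch (not the paper's exact algorithm): it minimizes a least-squares term plus a non-Lipschitz l_p penalty (0 < p < 1) over the unit sphere, replaces |x_i|^p with the smooth surrogate (x_i^2 + mu^2)^(p/2), takes steepest descent steps using the Riemannian gradient (the tangent-space projection of the Euclidean gradient), retracts by normalization, and gradually drives the smoothing parameter mu toward zero. The objective, step size, and mu-schedule are illustrative assumptions.

```python
import numpy as np

def smoothed_grad(x, A, b, lam, p, mu):
    # Euclidean gradient of ||Ax - b||^2 + lam * sum((x_i^2 + mu^2)^(p/2)),
    # where (x_i^2 + mu^2)^(p/2) smooths |x_i|^p (non-Lipschitz at 0 for p < 1).
    residual = A @ x - b
    penalty_grad = lam * p * x * (x**2 + mu**2) ** (p / 2 - 1)
    return 2 * A.T @ residual + penalty_grad

def sphere_step(x, grad, step):
    # Project the Euclidean gradient onto the tangent space T_x S^{n-1},
    # take a descent step in the ambient space, then retract by normalization.
    rgrad = grad - (x @ grad) * x
    y = x - step * rgrad
    return y / np.linalg.norm(y)

def smoothing_steepest_descent(A, b, lam=0.1, p=0.5, mu=1.0,
                               step=1e-2, iters=500, shrink=0.99):
    rng = np.random.default_rng(0)
    x = rng.normal(size=A.shape[1])
    x /= np.linalg.norm(x)
    for _ in range(iters):
        x = sphere_step(x, smoothed_grad(x, A, b, lam, p, mu), step)
        mu *= shrink  # drive the smoothing parameter toward 0
    return x
```

The structural point is that every iterate stays feasible on the manifold via the retraction, while the shrinking smoothing parameter controls how closely the surrogate tracks the non-Lipschitz term.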
Related papers
- Stochastic Zeroth-Order Optimization under Strongly Convexity and Lipschitz Hessian: Minimax Sample Complexity [59.75300530380427]
We consider the problem of optimizing second-order smooth and strongly convex functions when the algorithm only has access to noisy evaluations of the objective function it queries.
We provide the first tight characterization for the rate of the minimax simple regret by developing matching upper and lower bounds.
arXiv Detail & Related papers (2024-06-28T02:56:22Z) - Riemannian stochastic optimization methods avoid strict saddle points [68.80251170757647]
We show that the methods under study avoid strict saddle points / submanifolds with probability 1.
This result provides an important sanity check as it shows that, almost always, the limit state of an algorithm can only be a local minimizer.
arXiv Detail & Related papers (2023-11-04T11:12:24Z) - Decentralized Riemannian Conjugate Gradient Method on the Stiefel
Manifold [59.73080197971106]
This paper presents a first-order Riemannian conjugate gradient method that converges faster than the steepest descent method.
It aims to achieve global convergence over the Stiefel manifold.
arXiv Detail & Related papers (2023-08-21T08:02:16Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Riemannian Natural Gradient Methods [21.14740680011498]
We introduce the notion of the Fisher information matrix in the manifold setting, yielding a method that can be viewed as a natural extension of the natural gradient method.
We establish the almost-sure global convergence of our proposed method under standard assumptions.
Numerical experiments on applications arising from machine learning demonstrate the advantages of the proposed method over state-of-the-art ones.
arXiv Detail & Related papers (2022-07-15T04:33:10Z) - First-Order Algorithms for Min-Max Optimization in Geodesic Metric
Spaces [93.35384756718868]
Min-max algorithms have so far been analyzed mostly in the Euclidean setting.
We prove that the Riemannian corrected extragradient (RCEG) method achieves last-iterate convergence at a linear rate.
arXiv Detail & Related papers (2022-06-04T18:53:44Z) - Differentially private Riemannian optimization [40.23168342389821]
We introduce a framework for differentially private empirical risk minimization where the parameter is constrained to a Riemannian manifold (a rough sketch of one such private gradient step appears after this list).
We show the efficacy of the proposed framework in several applications.
arXiv Detail & Related papers (2022-05-19T12:04:15Z) - Stochastic Zeroth-order Riemannian Derivative Estimation and
Optimization [15.78743548731191]
We propose an oracle version of the Gaussian smoothing function to overcome the difficulty caused by the non-linearity of manifolds (a rough sketch of such a zeroth-order estimator appears after this list).
We demonstrate the applicability of our algorithms through numerical results and real-world applications, including black-box stiffness control for robotics and black-box attacks on neural networks.
arXiv Detail & Related papers (2020-03-25T06:58:19Z) - Support recovery and sup-norm convergence rates for sparse pivotal
estimation [79.13844065776928]
In high dimensional sparse regression, pivotal estimators are estimators for which the optimal regularization parameter is independent of the noise level.
We show minimax sup-norm convergence rates for non-smoothed and smoothed, single-task and multi-task square-root Lasso-type estimators.
arXiv Detail & Related papers (2020-01-15T16:11:04Z)
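As referenced in the differentially-private Riemannian optimization entry above, a common way to privatize a manifold gradient step is to clip the Riemannian gradient and add Gaussian noise inside the tangent space. The sketch below illustrates that pattern on the unit sphere; the clipping threshold and noise calibration are assumptions for illustration, not that paper's exact mechanism.

```python
import numpy as np

def dp_sphere_step(x, grad, step, clip=1.0, sigma=0.5, rng=None):
    # Illustrative differentially private Riemannian gradient step on S^{n-1}:
    # clip the tangent gradient to bound sensitivity, add tangent-space
    # Gaussian noise, then retract back to the sphere.
    rng = rng or np.random.default_rng()
    rgrad = grad - (x @ grad) * x          # tangent-space projection
    norm = np.linalg.norm(rgrad)
    if norm > clip:                        # bound the per-step sensitivity
        rgrad *= clip / norm
    noise = rng.normal(scale=sigma * clip, size=x.shape)
    noise -= (x @ noise) * x               # keep the noise in the tangent space
    y = x - step * (rgrad + noise)
    return y / np.linalg.norm(y)           # retraction
```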
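The zeroth-order Riemannian entry above estimates gradients from function values alone via Gaussian smoothing. A minimal sketch of that idea on the unit sphere follows: sample Gaussian directions, project them onto the tangent space, query the objective at a retracted perturbed point, and average forward differences. The sampling scheme and retraction here are assumptions, not that paper's exact oracle.

```python
import numpy as np

def zo_riemannian_grad(f, x, mu=1e-3, samples=20, rng=None):
    # Two-point zeroth-order estimate of the Riemannian gradient of f on
    # S^{n-1}, using only function evaluations of f.
    rng = rng or np.random.default_rng()
    fx = f(x)
    g = np.zeros_like(x)
    for _ in range(samples):
        u = rng.normal(size=x.shape)
        u -= (x @ u) * x                   # project noise onto T_x S^{n-1}
        y = x + mu * u
        y /= np.linalg.norm(y)             # retract the perturbed point
        g += (f(y) - fx) / mu * u          # forward-difference estimate
    return g / samples
```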