On Geometric Connections of Embedded and Quotient Geometries in
Riemannian Fixed-rank Matrix Optimization
- URL: http://arxiv.org/abs/2110.12121v2
- Date: Tue, 11 Apr 2023 01:04:00 GMT
- Title: On Geometric Connections of Embedded and Quotient Geometries in
Riemannian Fixed-rank Matrix Optimization
- Authors: Yuetian Luo and Xudong Li and Anru R. Zhang
- Abstract summary: This paper proposes a general procedure for establishing the geometric landscape connections of a Riemannian optimization problem under the embedded and quotient geometries.
We observe an algorithmic connection between two geometries with some specific Riemannian metrics in fixed-rank matrix optimization.
Results provide a few new theoretical insights into unanswered questions in the literature.
- Score: 5.876141028192136
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: In this paper, we propose a general procedure for establishing the geometric
landscape connections of a Riemannian optimization problem under the embedded
and quotient geometries. By applying the general procedure to the fixed-rank
positive semidefinite (PSD) and general matrix optimization, we establish an
exact Riemannian gradient connection under two geometries at every point on the
manifold and sandwich inequalities between the spectra of Riemannian Hessians
at Riemannian first-order stationary points (FOSPs). These results immediately
imply an equivalence on the sets of Riemannian FOSPs, Riemannian second-order
stationary points (SOSPs), and strict saddles of fixed-rank matrix optimization
under the embedded and the quotient geometries. To the best of our knowledge,
this is the first geometric landscape connection between the embedded and the
quotient geometries for fixed-rank matrix optimization and it provides a
concrete example of how these two geometries are connected in Riemannian
optimization. In addition, the effects of the Riemannian metric and quotient
structure on the landscape connection are discussed. We also observe an
algorithmic connection between two geometries with some specific Riemannian
metrics in fixed-rank matrix optimization: there is an equivalence between
gradient flows under two geometries with shared spectra of Riemannian Hessians.
A number of novel ideas and technical ingredients including a unified treatment
for different Riemannian metrics, novel metrics for the Stiefel manifold, and
new horizontal space representations under quotient geometries are developed to
obtain our results. The results in this paper deepen our understanding of
geometric and algorithmic connections of Riemannian optimization under
different Riemannian geometries and provide a few new theoretical insights into
unanswered questions in the literature.
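As a concrete illustration of the embedded geometry discussed in the abstract, the Riemannian gradient on the fixed-rank manifold is the projection of the Euclidean gradient onto the tangent space at the current point. The following minimal NumPy sketch uses the standard tangent-space projection formula for the rank-r manifold; it is an illustration under textbook formulas, not the authors' code:

```python
import numpy as np

# For X = U S V^T (thin SVD) on M_r = {X : rank(X) = r}, the orthogonal
# projection of Z onto the tangent space at X is
#   P_X(Z) = U U^T Z + Z V V^T - U U^T Z V V^T.
# The embedded Riemannian gradient is P_X applied to the Euclidean gradient.

def tangent_projection(U, V, Z):
    """Project Z onto the tangent space of the rank-r manifold at X = U S V^T."""
    PU = U @ (U.T @ Z)                      # U U^T Z
    PV = (Z @ V) @ V.T                      # Z V V^T
    return PU + PV - U @ (U.T @ Z @ V) @ V.T

rng = np.random.default_rng(0)
m, n, r = 6, 5, 2
U, _ = np.linalg.qr(rng.standard_normal((m, r)))   # orthonormal column bases
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
egrad = rng.standard_normal((m, n))                # stand-in Euclidean gradient
rgrad = tangent_projection(U, V, egrad)            # embedded Riemannian gradient

# Orthogonal projections are idempotent: projecting twice changes nothing.
assert np.allclose(tangent_projection(U, V, rgrad), rgrad)
```

The same Euclidean gradient, paired with a different metric or a quotient representation, yields the other geometries whose connections the paper analyzes.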
Related papers
- RMLR: Extending Multinomial Logistic Regression into General Geometries [64.16104856124029]
Our framework only requires minimal geometric properties, thus exhibiting broad applicability.
We develop five families of SPD MLRs under five types of power-deformed metrics.
On rotation matrices, we propose a Lie MLR based on the popular bi-invariant metric.
arXiv Detail & Related papers (2024-09-28T18:38:21Z)
- FORML: A Riemannian Hessian-free Method for Meta-learning on Stiefel Manifolds [4.757859522106933]
This paper introduces a Hessian-free approach that uses a first-order approximation of derivatives on the Stiefel manifold.
Our method significantly reduces the computational load and memory footprint.
arXiv Detail & Related papers (2024-02-28T10:57:30Z)
- The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry allows for leveraging tools from differential geometry.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z)
- Curvature-Independent Last-Iterate Convergence for Games on Riemannian Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z)
- A Survey of Geometric Optimization for Deep Learning: From Euclidean Space to Riemannian Manifold [7.737713458418288]
Deep Learning (DL) has achieved success in complex Artificial Intelligence (AI) tasks, but it suffers from various notorious problems.
This article presents a comprehensive survey of applying geometric optimization in DL.
It investigates the application of geometric optimization in different DL networks in various AI tasks, e.g., convolutional neural networks, recurrent neural networks, transfer learning, and optimal transport.
arXiv Detail & Related papers (2023-02-16T10:50:15Z)
- The Dynamics of Riemannian Robbins-Monro Algorithms [101.29301565229265]
We propose a family of Riemannian algorithms generalizing and extending the seminal approximation framework of Robbins and Monro.
Compared to their Euclidean counterparts, Riemannian algorithms are much less understood due to the lack of a global linear structure on the manifold.
We provide a general template of almost sure convergence results that mirrors and extends the existing theory for Euclidean Robbins-Monro schemes.
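To make the Robbins-Monro template concrete, here is a hedged sketch of a Riemannian stochastic-approximation-style iteration on the unit sphere, with the Rayleigh quotient as a stand-in objective; the projection-plus-normalization scheme and step sizes are my illustrative choices, not the paper's algorithm:

```python
import numpy as np

# Riemannian iteration x_{k+1} = Retr_{x_k}(-a_k * grad f(x_k)) on the unit
# sphere, where the retraction is renormalization and the Riemannian gradient
# is the tangent-space projection of the Euclidean gradient.
# Objective: f(x) = x^T A x (Rayleigh quotient), A symmetric.

rng = np.random.default_rng(2)
n = 8
A = rng.standard_normal((n, n)); A = (A + A.T) / 2

def rgrad(x):
    g = 2 * A @ x               # Euclidean gradient of x^T A x
    return g - (x @ g) * x      # project onto tangent space {v : x^T v = 0}

x = rng.standard_normal(n); x /= np.linalg.norm(x)
for k in range(2000):
    step = 1.0 / (k + 10)       # Robbins-Monro steps: sum a_k = inf, sum a_k^2 < inf
    x = x - step * rgrad(x)
    x /= np.linalg.norm(x)      # retraction back to the sphere

# The iterate typically drifts toward an eigenvector of A, a stationary
# point of the Rayleigh quotient on the sphere.
```

In the stochastic setting the paper studies, `rgrad` would be replaced by a noisy estimate, and the almost-sure convergence template governs the behavior of such iterates.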
arXiv Detail & Related papers (2022-06-14T12:30:11Z)
- First-Order Algorithms for Min-Max Optimization in Geodesic Metric Spaces [93.35384756718868]
Min-max algorithms have mostly been analyzed in the Euclidean setting.
We prove that the Riemannian corrected extragradient (RCEG) method achieves last-iterate convergence at a linear rate.
arXiv Detail & Related papers (2022-06-04T18:53:44Z)
- A singular Riemannian geometry approach to Deep Neural Networks I. Theoretical foundations [77.86290991564829]
Deep Neural Networks are widely used for solving complex problems in several scientific areas, such as speech recognition, machine translation, and image analysis.
We study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric.
We investigate the theoretical properties of the maps in such a sequence, eventually focusing on maps between manifolds implementing neural networks of practical interest.
arXiv Detail & Related papers (2021-12-17T11:43:30Z)
- Nonconvex Factorization and Manifold Formulations are Almost Equivalent in Low-rank Matrix Optimization [8.59387261480044]
We consider the geometric landscape connection of the widely studied manifold and factorization formulations in low-rank positive semidefinite (PSD) and general matrix optimization.
We show the sandwich relation can be used to transfer more quantitative geometric properties from one formulation to another.
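The factorization formulation compared here can be sketched with a toy PSD objective; the choice f(X) = ||X - A||_F^2 is my illustrative assumption, not the paper's example. The chain rule links the Euclidean gradient of f to the gradient of the factorized objective g(Y) = f(YY^T):

```python
import numpy as np

# Factorized formulation for the PSD case: write X = Y Y^T with Y of size
# n x r, so g(Y) = f(Y Y^T). When grad f is symmetric, the chain rule gives
#   grad g(Y) = 2 * grad f(Y Y^T) @ Y.

rng = np.random.default_rng(1)
n, r = 5, 2
A = rng.standard_normal((n, n)); A = (A + A.T) / 2   # symmetric target

def grad_f(X):                  # Euclidean gradient of f(X) = ||X - A||_F^2
    return 2.0 * (X - A)

def grad_g(Y):                  # gradient of g(Y) = f(Y Y^T), grad_f symmetric here
    return 2.0 * grad_f(Y @ Y.T) @ Y

# Finite-difference check of the chain rule at a random Y, direction E.
Y = rng.standard_normal((n, r))
E = rng.standard_normal((n, r)); eps = 1e-6
f = lambda Yv: np.linalg.norm(Yv @ Yv.T - A) ** 2
fd = (f(Y + eps * E) - f(Y - eps * E)) / (2 * eps)
assert np.isclose(fd, np.sum(grad_g(Y) * E), rtol=1e-3)
```

The manifold formulation instead optimizes f directly over the rank-r PSD matrices; the sandwich relation mentioned above is what lets geometric properties transfer between the two viewpoints.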
arXiv Detail & Related papers (2021-08-03T22:14:01Z)
- Nested Grassmannians for Dimensionality Reduction with Applications [7.106986689736826]
We propose a novel framework for constructing a nested sequence of homogeneous Riemannian manifolds.
We focus on applying the proposed framework to the Grassmann manifold, giving rise to the nested Grassmannians (NG).
Specifically, each planar (2D) shape can be represented as a point in complex projective space, which is a complex Grassmann manifold.
With the proposed NG structure, we develop algorithms for the supervised and unsupervised dimensionality reduction problems, respectively.
arXiv Detail & Related papers (2020-10-27T20:09:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.