Nonconvex Factorization and Manifold Formulations are Almost Equivalent in Low-rank Matrix Optimization
- URL: http://arxiv.org/abs/2108.01772v1
- Date: Tue, 3 Aug 2021 22:14:01 GMT
- Title: Nonconvex Factorization and Manifold Formulations are Almost Equivalent in Low-rank Matrix Optimization
- Authors: Yuetian Luo and Xudong Li and Anru R. Zhang
- Abstract summary: We consider the geometric landscape connection of the widely studied manifold and factorization formulations in low-rank positive semidefinite (PSD) and general matrix optimization.
We establish an equivalence on the set of first-order stationary points (FOSPs) and second-order stationary points (SOSPs) between the manifold and the factorization formulations.
Similarities and differences in the landscape connection between the PSD case and the general case are discussed.
- Score: 6.462179538647346
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: In this paper, we consider the geometric landscape connection of the widely
studied manifold and factorization formulations in low-rank positive
semidefinite (PSD) and general matrix optimization. We establish an equivalence
on the set of first-order stationary points (FOSPs) and second-order stationary
points (SOSPs) between the manifold and the factorization formulations. We
further give a sandwich inequality on the spectrum of Riemannian and Euclidean
Hessians at FOSPs, which can be used to transfer more geometric properties from
one formulation to another. Similarities and differences in the landscape connection between the PSD case and the general case are discussed. To the best
of our knowledge, this is the first geometric landscape connection between the
manifold and the factorization formulations for handling rank constraints. In
the general low-rank matrix optimization, the landscape connection of two
factorization formulations (unregularized and regularized ones) is also
provided. By applying these geometric landscape connections, we are able to resolve unanswered questions in the literature and establish stronger results in applications including the geometric analysis of phase retrieval, well-conditioned low-rank matrix optimization, and the role of regularization in factorization formulations arising from machine learning and signal processing.
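As a reading aid (not part of the paper's abstract), here is a minimal sketch of the two formulations compared in the PSD case, assuming a smooth objective f, a target rank r, and the notation M_r for the fixed-rank PSD manifold; the paper's exact constants and assumptions are not reproduced here.
$$
\text{Manifold formulation:}\quad \min_{X \in \mathcal{M}_r} f(X), \qquad \mathcal{M}_r := \{\, X \in \mathbb{S}^{n \times n} : X \succeq 0,\ \operatorname{rank}(X) = r \,\},
$$
$$
\text{Factorization formulation:}\quad \min_{Y \in \mathbb{R}^{n \times r}} g(Y), \qquad g(Y) := f(YY^\top).
$$
At a full-column-rank $Y$ with $X = YY^\top$ (and assuming $\nabla f(X)$ is symmetric), the corresponding first-order conditions read
$$
\nabla g(Y) = 2\,\nabla f(X)\,Y = 0 \quad\text{(factorization FOSP)}, \qquad P_{T_X \mathcal{M}_r}\!\big(\nabla f(X)\big) = 0 \quad\text{(Riemannian FOSP)}.
$$
The equivalence described in the abstract relates these two notions of stationarity (and, analogously, SOSPs), and the sandwich inequality compares the spectrum of the Euclidean Hessian $\nabla^2 g(Y)$ with that of the Riemannian Hessian at FOSPs.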
Related papers
- Regularized Projection Matrix Approximation with Applications to Community Detection [1.3761665705201904]
This paper introduces a regularized projection matrix approximation framework designed to recover cluster information from the affinity matrix.
We investigate three distinct penalty functions, each specifically tailored to address bounded, positive, and sparse scenarios.
Numerical experiments conducted on both synthetic and real-world datasets reveal that our regularized projection matrix approximation approach significantly outperforms state-of-the-art methods in clustering performance.
arXiv Detail & Related papers (2024-05-26T15:18:22Z)
- Synergistic eigenanalysis of covariance and Hessian matrices for enhanced binary classification [72.77513633290056]
We present a novel approach that combines the eigenanalysis of a covariance matrix evaluated on a training set with that of a Hessian matrix evaluated on a deep learning model.
Our method captures intricate patterns and relationships, enhancing classification performance.
arXiv Detail & Related papers (2024-02-14T16:10:42Z)
- Linearization Algorithms for Fully Composite Optimization [61.20539085730636]
This paper studies first-order algorithms for solving fully composite optimization problems over convex compact sets.
We leverage the structure of the objective by handling its differentiable and non-differentiable components separately, linearizing only the smooth parts.
arXiv Detail & Related papers (2023-02-24T18:41:48Z)
- Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z)
- Geometric Methods for Sampling, Optimisation, Inference and Adaptive Agents [102.42623636238399]
We identify fundamental geometric structures that underlie the problems of sampling, optimisation, inference and adaptive decision-making.
We derive algorithms that exploit these geometric structures to solve these problems efficiently.
arXiv Detail & Related papers (2022-03-20T16:23:17Z)
- Relative Pose from SIFT Features [50.81749304115036]
We derive a new linear constraint relating the unknown elements of the fundamental matrix to the orientation and scale of SIFT features.
The proposed constraint is tested on a number of problems in a synthetic environment and on publicly available real-world datasets on more than 80000 image pairs.
arXiv Detail & Related papers (2022-03-15T14:16:39Z)
- On Geometric Connections of Embedded and Quotient Geometries in Riemannian Fixed-rank Matrix Optimization [5.876141028192136]
This paper proposes a general procedure for establishing the geometric landscape connections of a Riemannian optimization problem under the embedded and quotient geometries.
We observe an algorithmic connection between two geometries with some specific Riemannian metrics in fixed-rank matrix optimization.
Results provide a few new theoretical insights into unanswered questions in the literature.
arXiv Detail & Related papers (2021-10-23T03:13:56Z)
- Optimization and Sampling Under Continuous Symmetry: Examples and Lie Theory [26.555110725656963]
We show examples involving Kostant's theorem, Lie groups, Lie algebras, and the Harish-Chandra--Itzykson--Zuber (HCIZ) integral formulas.
We then present an introduction to Lie theory -- an indispensable mathematical toolkit for capturing continuous symmetries.
arXiv Detail & Related papers (2021-09-02T16:44:44Z)
- Joint Network Topology Inference via Structured Fusion Regularization [70.30364652829164]
Joint network topology inference represents a canonical problem of learning multiple graph Laplacian matrices from heterogeneous graph signals.
We propose a general graph estimator based on a novel structured fusion regularization.
We show that the proposed graph estimator enjoys both high computational efficiency and rigorous theoretical guarantees.
arXiv Detail & Related papers (2021-03-05T04:42:32Z)
- Symmetric Positive Semi-definite Riemannian Geometry with Application to Domain Adaptation [7.126737403006778]
We present new results on the geometry of symmetric positive semi-definite (SPSD) matrices.
We propose an algorithm for Domain Adaptation (DA) and demonstrate its performance in two applications: fusion of hyper-spectral images and motion identification.
arXiv Detail & Related papers (2020-07-28T14:39:36Z)
- Nonconvex Matrix Completion with Linearly Parameterized Factors [10.163102766021373]
This parametric factorization covers important examples, including subspace-constrained matrix completion.
The effectiveness of our unified nonconvex matrix optimization method is also illustrated through simulations.
arXiv Detail & Related papers (2020-03-29T22:40:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.