Geometric design of the tangent term in landing algorithms for orthogonality constraints
- URL: http://arxiv.org/abs/2507.15638v1
- Date: Mon, 21 Jul 2025 14:00:51 GMT
- Title: Geometric design of the tangent term in landing algorithms for orthogonality constraints
- Authors: Florentin Goyens, P.-A. Absil, Florian Feppon
- Abstract summary: The family of metrics we propose is a natural extension of the $\beta$-metric, defined on the Stiefel manifold.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a family of metrics over the set of full-rank $n\times p$ real matrices, and apply them to the landing framework for optimization under orthogonality constraints. The family of metrics we propose is a natural extension of the $\beta$-metric, defined on the Stiefel manifold.
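For orientation, a landing method avoids retractions by combining a tangent-direction term with a penalty term that attracts iterates back to the constraint set. The snippet below is a minimal NumPy sketch of one such update, assuming the common penalty $N(X)=\tfrac14\|X^\top X - I\|_F^2$ and a skew-symmetric relative-gradient tangent term; it does not implement the metric family or the tangent-term design proposed in this paper.
```python
import numpy as np

def landing_step(X, grad_f, eta=0.1, lam=1.0):
    """One landing-style update for minimizing f over matrices with
    (approximately) orthonormal columns, without a retraction.

    X      : current (n, p) iterate with full column rank
    grad_f : Euclidean gradient of f at X, shape (n, p)
    eta    : step size
    lam    : weight of the penalty pulling X back toward X^T X = I
    """
    n, p = X.shape
    # Tangent-like term: a skew-symmetric relative gradient applied to X
    # (one common choice in the landing literature, not this paper's design).
    A = grad_f @ X.T
    psi = (A - A.T) @ X / 2.0
    # Penalty term: gradient of N(X) = (1/4) * ||X^T X - I||_F^2.
    penalty = X @ (X.T @ X - np.eye(p))
    return X - eta * (psi + lam * penalty)

# Toy usage: minimize f(X) = -trace(C^T X) near the Stiefel manifold.
rng = np.random.default_rng(0)
n, p = 8, 3
C = rng.standard_normal((n, p))
X = np.linalg.qr(rng.standard_normal((n, p)))[0]
for _ in range(200):
    X = landing_step(X, -C, eta=0.05, lam=1.0)
print(np.linalg.norm(X.T @ X - np.eye(p)))  # stays small: iterates hover near the manifold
```
The point of the construction is that no QR or polar retraction is needed per step; the penalty term alone keeps the distance to the Stiefel manifold controlled for suitable step sizes.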
Related papers
- RiemanLine: Riemannian Manifold Representation of 3D Lines for Factor Graph Optimization [49.83974390433746]
This paper introduces RiemanLine, a unified minimal representation for 3D lines. Our key idea is to decouple each line landmark into global and local components. Experiments on ICL-NUIM, TartanAir, and synthetic benchmarks demonstrate that our method achieves significantly more accurate pose estimation and line reconstruction.
arXiv Detail & Related papers (2025-08-06T11:27:38Z) - Riemannian Optimization for Distance Geometry: A Study of Convergence, Robustness, and Incoherence [6.422262171968397]
The Euclidean Distance Geometry (EDG) problem arises in a broad range of applications, including sensor network localization, molecular conformation, and manifold learning. In this paper, we propose a framework for solving the EDG problem by formulating it as a low-rank matrix completion task over the space of positive semi-definite Gram matrices. The available distance measurements are encoded as expansion coefficients in a non-orthogonal basis, and optimization over the Gram matrix implicitly enforces geometric consistency through the triangle inequality.
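As background for the Gram-matrix viewpoint, the fully observed case reduces to classical multidimensional scaling: squared distances are double-centered to produce a Gram matrix whose low-rank factorization gives coordinates. The sketch below shows only that textbook step; it is not the paper's completion method for partial measurements.
```python
import numpy as np

def gram_from_squared_distances(D2):
    """Classical MDS double centering: recover a Gram matrix G from the
    matrix of squared pairwise distances D2 (fully observed case)."""
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return -0.5 * J @ D2 @ J

# Toy check: planar points -> squared distances -> Gram -> coordinates.
rng = np.random.default_rng(1)
P = rng.standard_normal((5, 2))
D2 = ((P[:, None, :] - P[None, :, :]) ** 2).sum(-1)
G = gram_from_squared_distances(D2)
w, V = np.linalg.eigh(G)
coords = V[:, -2:] * np.sqrt(np.maximum(w[-2:], 0))  # rank-2 embedding
# coords matches P up to a rigid motion (rotation/reflection plus translation)
```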
arXiv Detail & Related papers (2025-07-31T18:40:42Z) - Generalized Gradient Norm Clipping & Non-Euclidean $(L_0,L_1)$-Smoothness [51.302674884611335]
This work introduces a hybrid non-Euclidean optimization method which generalizes norm clipping by combining steepest descent and conditional gradient approaches. We discuss how to instantiate the algorithms for deep learning and demonstrate their properties on image classification and language modeling.
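As a baseline for what is being generalized, Euclidean gradient norm clipping rescales a gradient so its norm never exceeds a threshold; a minimal sketch follows (the hybrid steepest-descent/conditional-gradient method of the paper is not reproduced here).
```python
import numpy as np

def clip_gradient(g, max_norm):
    """Standard gradient norm clipping: rescale g so that ||g|| <= max_norm."""
    norm = np.linalg.norm(g)
    if norm > max_norm:
        g = g * (max_norm / norm)
    return g

g = np.array([3.0, 4.0])          # ||g|| = 5
print(clip_gradient(g, 1.0))      # [0.6, 0.8], norm 1
```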
arXiv Detail & Related papers (2025-06-02T17:34:29Z) - On lower bounds of the density of planar periodic sets without unit distances [55.2480439325792]
We introduce a novel approach to estimating $m_1(\mathbb{R}^2)$ by reformulating the problem as a Maximal Independent Set (MIS) problem on graphs constructed from the flat torus. Our experimental results, supported by theoretical justifications of the proposed method, demonstrate that for a sufficiently wide range of parameters this approach does not improve the known lower bound.
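For readers unfamiliar with the combinatorial side, a maximal independent set of a graph can be found greedily; the sketch below is a generic illustration only and does not implement the paper's torus-based graph construction or its density estimates.
```python
def greedy_mis(adjacency):
    """Greedy maximal independent set.

    adjacency: dict mapping each vertex to the set of its neighbours.
    Returns a set of pairwise non-adjacent vertices to which no further
    vertex can be added (maximal, not necessarily maximum)."""
    independent = set()
    blocked = set()
    for v in sorted(adjacency):          # fixed order for reproducibility
        if v not in blocked:
            independent.add(v)
            blocked.add(v)
            blocked |= adjacency[v]
    return independent

# Toy usage: the 4-cycle 0-1-2-3-0.
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(greedy_mis(adj))  # {0, 2}
```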
arXiv Detail & Related papers (2024-11-20T12:07:19Z) - Structured Regularization for Constrained Optimization on the SPD Manifold [1.1126342180866644]
We introduce a class of structured regularizers, based on symmetric gauge functions, which allow for solving constrained optimization on the SPD manifold with faster unconstrained methods.
We show that our structured regularizers can be chosen to preserve or induce desirable structure, in particular convexity and "difference of convex" structure.
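To give a concrete feel for symmetric gauge functions, the sketch below evaluates a spectral regularizer by applying a permutation-invariant norm to the eigenvalues of a symmetric matrix; the particular regularizers and structure-preservation results of the paper are not reproduced.
```python
import numpy as np

def spectral_regularizer(P, gauge=lambda lam: np.sum(np.abs(lam))):
    """Apply a symmetric gauge function (a permutation- and sign-invariant
    norm) to the eigenvalues of a symmetric matrix P.  With the default
    l1 gauge this is the trace norm; on the SPD cone it equals trace(P)."""
    lam = np.linalg.eigvalsh(P)
    return gauge(lam)

A = np.random.default_rng(2).standard_normal((4, 4))
P = A @ A.T + 1e-3 * np.eye(4)                 # an SPD matrix
print(spectral_regularizer(P))                 # equals np.trace(P) here
print(spectral_regularizer(P, gauge=np.max))   # spectral-norm gauge
```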
arXiv Detail & Related papers (2024-10-12T22:11:22Z) - Product Geometries on Cholesky Manifolds with Applications to SPD Manifolds [65.04845593770727]
We present two new metrics on the Symmetric Positive Definite (SPD) manifold via the Cholesky manifold.
Our metrics are easy to use, computationally efficient, and numerically stable.
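As an illustration of the Cholesky-manifold approach (not of the two new metrics proposed in the paper), the existing Log-Cholesky distance of Lin (2019) compares Cholesky factors entrywise off the diagonal and logarithmically on the diagonal; a minimal sketch:
```python
import numpy as np

def log_cholesky_distance(P, Q):
    """Log-Cholesky geodesic distance between two SPD matrices: Euclidean
    difference of the strictly lower Cholesky parts plus the difference of
    the log-diagonals."""
    L, K = np.linalg.cholesky(P), np.linalg.cholesky(Q)
    strict = np.tril(L, -1) - np.tril(K, -1)
    diag = np.log(np.diag(L)) - np.log(np.diag(K))
    return np.sqrt(np.sum(strict ** 2) + np.sum(diag ** 2))

rng = np.random.default_rng(3)
A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
P, Q = A @ A.T + np.eye(3), B @ B.T + np.eye(3)
print(log_cholesky_distance(P, Q))   # 0 iff P == Q
```
Working through the Cholesky factor is what makes such metrics cheap and numerically stable: the factorization is unique for SPD matrices and avoids eigendecompositions.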
arXiv Detail & Related papers (2024-07-02T18:46:13Z) - Global $\mathcal{L}^2$ minimization at uniform exponential rate via geometrically adapted gradient descent in Deep Learning [1.4050802766699084]
We consider the scenario of supervised learning in Deep Learning (DL) networks. We choose the gradient flow with respect to the Euclidean metric in the output layer of the DL network.
arXiv Detail & Related papers (2023-11-27T02:12:02Z) - Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to estimate lies in a smooth manifold.
It induces a geometry for the parameter manifold, as well as an intrinsic notion of the estimation error measure.
arXiv Detail & Related papers (2023-11-08T15:17:13Z) - The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry allows for leveraging tools from differential geometry.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z) - Estimating Parameterized Entanglement Measure [2.9690504594380185]
The $q$-concurrence is a reasonable parameterized entanglement measure.
We present tight lower bounds of the $q$-concurrence for arbitrary $q\geqslant 2$.
arXiv Detail & Related papers (2022-06-16T04:13:17Z) - Operator-valued formulas for Riemannian Gradient and Hessian and
families of tractable metrics [0.0]
We provide a formula for a quotient of a manifold embedded in an inner product space with a non-constant metric function.
We extend the list of potential metrics that could be used in manifold optimization and machine learning.
arXiv Detail & Related papers (2020-09-21T20:15:57Z) - Curvature-Dependant Global Convergence Rates for Optimization on
Manifolds of Bounded Geometry [6.85316573653194]
We give curvature-dependant convergence rates for weakly convex functions defined on a manifold of 1-bounded geometry.
We compute these bounds explicitly for some manifolds commonly used in the optimization literature.
We present self-contained proofs of fully general bounds on the norm of the differential of the exponential map.
arXiv Detail & Related papers (2020-08-06T08:30:35Z) - Graph Metric Learning via Gershgorin Disc Alignment [46.145969174332485]
We propose a fast general projection-free metric learning framework, where the objective $\min_{\mathbf{M} \in \mathcal{S}}$ is a convex differentiable function of the metric matrix $\mathbf{M}$.
We prove that the Gershgorin discs can be aligned perfectly using the first eigenvector $\mathbf{v}$ of $\mathbf{M}$.
Experiments show that our efficiently computed graph metric matrices outperform metrics learned using competing methods in terms of classification tasks.
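For reference, Gershgorin's theorem localizes the eigenvalues of a matrix in discs centered at the diagonal entries with radii equal to the off-diagonal absolute row sums, so a symmetric matrix whose discs all have non-negative left endpoints is certifiably positive semidefinite. The sketch below computes the discs and that certificate; it is a hand-written illustration, not the paper's projection-free learning framework.
```python
import numpy as np

def gershgorin_discs(M):
    """Return (centers, radii) of the Gershgorin discs of M:
    center_i = M[i, i], radius_i = sum_{j != i} |M[i, j]|."""
    centers = np.diag(M).astype(float)
    radii = np.sum(np.abs(M), axis=1) - np.abs(centers)
    return centers, radii

def gershgorin_psd_certificate(M):
    """Sufficient (not necessary) PSD test for a real symmetric matrix:
    every disc's left endpoint is non-negative."""
    c, r = gershgorin_discs(M)
    return bool(np.all(c - r >= 0))

M = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
print(gershgorin_discs(M))
print(gershgorin_psd_certificate(M))  # True: left endpoints are 1, 0, 1
```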
arXiv Detail & Related papers (2020-01-28T17:44:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.