Constant Metric Scaling in Riemannian Computation
- URL: http://arxiv.org/abs/2601.10992v1
- Date: Fri, 16 Jan 2026 04:54:23 GMT
- Title: Constant Metric Scaling in Riemannian Computation
- Authors: Kisung You
- Abstract summary: We show how a global metric scale parameter can be introduced in Riemannian computation without altering the geometric structures on which these methods rely. The note is purely expository.
- Score: 0.31727619150610836
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Constant rescaling of a Riemannian metric appears in many computational settings, often through a global scale parameter that is introduced either explicitly or implicitly. Although this operation is elementary, its consequences are not always made clear in practice and may be confused with changes in curvature, manifold structure, or coordinate representation. In this note we provide a short, self-contained account of constant metric scaling on arbitrary Riemannian manifolds. We distinguish between quantities that change under such a scaling, including norms, distances, volume elements, and gradient magnitudes, and geometric objects that remain invariant, such as the Levi--Civita connection, geodesics, exponential and logarithmic maps, and parallel transport. We also discuss implications for Riemannian optimization, where constant metric scaling can often be interpreted as a global rescaling of step sizes rather than a modification of the underlying geometry. The goal of this note is purely expository and is intended to clarify how a global metric scale parameter can be introduced in Riemannian computation without altering the geometric structures on which these methods rely.
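The distinction drawn in the abstract is easy to verify numerically. The sketch below (my own illustration on the unit sphere $S^2$; the function names are not from the paper) checks that under $g' = c\,g$ the Riemannian gradient shrinks by $1/c$, the exponential map is unchanged, and one Riemannian gradient step under $g'$ with step size $\eta$ equals a step under $g$ with step size $\eta/c$:

```python
import numpy as np

# Minimal sketch on the unit sphere S^2 in R^3 with scaled metric g' = c * g.
# (Illustration only; function names are not from the paper.)

c = 4.0  # global metric scale

def sphere_exp(x, v):
    """Exponential map on the unit sphere; the same under g and g' = c * g,
    because constant scaling leaves the Levi-Civita connection unchanged."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

def riemannian_grad(x, egrad, scale=1.0):
    """Project the Euclidean gradient onto the tangent space at x, then apply
    the inverse metric: under g' = c * g, grad' f = (1/c) grad f."""
    tangent = egrad - np.dot(x, egrad) * x
    return tangent / scale

a = np.array([1.0, 2.0, -0.5])     # objective f(x) = <a, x> on the sphere
x = np.array([0.0, 0.0, 1.0])

g1 = riemannian_grad(x, a)             # gradient under g
gc = riemannian_grad(x, a, scale=c)    # gradient under g' = c * g
assert np.allclose(gc, g1 / c)

# One Riemannian gradient step: scaling the metric by c is the same as
# dividing the step size by c under the original metric.
eta = 0.1
assert np.allclose(sphere_exp(x, -eta * gc), sphere_exp(x, -(eta / c) * g1))
```

Distances behave dually: the great-circle distance $\arccos\langle x, y\rangle$ under $g$ becomes $\sqrt{c}\,\arccos\langle x, y\rangle$ under $g'$, which is why a global scale parameter can be absorbed into the learning rate without touching the geometry.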
Related papers
- Beyond Optimization: Intelligence as Metric-Topology Factorization under Geometric Incompleteness [6.0044467881527614]
We argue that intelligence is not navigation through a fixed maze, but the ability to reshape representational geometry so desired behaviors become stable attractors. We show any fixed metric is geometrically incomplete: for any local metric representation, some topological transformations make it singular or incoherent. We introduce the Topological Urysohn Machine (TUM), implementing MTF through memory-amortized metric inference.
arXiv Detail & Related papers (2026-02-08T13:59:22Z)
- Riemannian Zeroth-Order Gradient Estimation with Structure-Preserving Metrics for Geodesically Incomplete Manifolds [57.179679246370114]
We construct metrics that are geodesically complete while ensuring that every stationary point under the new metric remains stationary under the original one. An $\epsilon$-stationary point under the constructed metric $g'$ also corresponds to an $\epsilon$-stationary point under the original metric $g$. Experiments on a practical mesh optimization task demonstrate that our framework maintains stable convergence even in the absence of geodesic completeness.
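In the conformal special case $g' = \varphi(x)\,g$ with $\varphi > 0$, the mechanism behind this stationarity claim is elementary: the Riemannian gradient rescales pointwise by $1/\varphi(x)$, so its zeros are unchanged. A minimal sketch of that special case (my illustration; the paper's construction is more general):

```python
import numpy as np

# Under g' = phi(x) * g with phi > 0, the Riemannian gradient rescales as
# grad' f(x) = grad f(x) / phi(x), so stationary points coincide with those
# under g. (Special-case illustration; the paper's construction is broader.)

def grad_f(x):
    return x - 1.0                 # Euclidean gradient of f(x) = ||x - 1||^2 / 2

def phi(x):
    return 1.0 + np.dot(x, x)      # any smooth, strictly positive scale field

def grad_scaled(x):
    return grad_f(x) / phi(x)      # gradient under the rescaled metric

x_star = np.ones(2)
assert np.allclose(grad_scaled(x_star), 0.0)   # stationary under g'
assert np.allclose(grad_f(x_star), 0.0)        # ... and under g

x = np.array([3.0, -2.0])
assert not np.allclose(grad_scaled(x), 0.0)    # non-stationary points stay so
```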
arXiv Detail & Related papers (2026-01-12T22:08:03Z)
- Follow the Energy, Find the Path: Riemannian Metrics from Energy-Based Models [63.331590876872944]
We propose a method for deriving Riemannian metrics directly from pretrained Energy-Based Models. These metrics define spatially varying distances, enabling the computation of geodesics. We show that EBM-derived metrics consistently outperform established baselines.
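A common way to turn an energy into a spatially varying metric is a conformal scaling such as $g_x = e^{E(x)} I$; whether or not this matches the paper's exact construction, it shows how geodesic computation reduces to minimizing a discretized curve length. A hypothetical sketch:

```python
import numpy as np

# Hypothetical conformal metric from an energy: g_x = exp(E(x)) * I.
# Curve length discretizes to a sum of metric-weighted segment lengths;
# minimizing it over curves yields geodesics that avoid high-energy regions.

def energy(x):
    return 0.5 * np.sum(x**2)      # stand-in for a pretrained EBM's energy

def curve_length(points):
    total = 0.0
    for a, b in zip(points[:-1], points[1:]):
        mid = 0.5 * (a + b)        # evaluate the metric at the segment midpoint
        total += np.sqrt(np.exp(energy(mid))) * np.linalg.norm(b - a)
    return total

line = np.linspace([-2.0, 0.0], [2.0, 0.0], 50)   # straight path through origin
print(curve_length(line))
```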
arXiv Detail & Related papers (2025-05-23T12:18:08Z)
- Riemann$^2$: Learning Riemannian Submanifolds from Riemannian Data [12.424539896723603]
Latent variable models are powerful tools for learning low-dimensional manifolds from high-dimensional data. This paper generalizes previous work and allows us to handle complex tasks in various domains, including robot motion synthesis and analysis of brain connectomes.
arXiv Detail & Related papers (2025-03-07T16:08:53Z)
- The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry makes it possible to leverage tools from differential geometry.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
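For the Gaussian member of an elliptical family, the Fisher-Rao geometry is hyperbolic and the distance is available in closed form. A sketch using the standard univariate-Gaussian formula (general CES families, the paper's subject, require more machinery):

```python
import numpy as np

# Closed-form Fisher-Rao distance between univariate Gaussians
# N(mu1, s1^2) and N(mu2, s2^2); the underlying geometry is hyperbolic.

def fisher_rao_gaussian(mu1, s1, mu2, s2):
    num = (mu1 - mu2) ** 2 / 2.0 + (s1 - s2) ** 2
    return np.sqrt(2.0) * np.arccosh(1.0 + num / (2.0 * s1 * s2))

print(fisher_rao_gaussian(0.0, 1.0, 0.0, 1.0))   # 0.0 for identical laws
print(fisher_rao_gaussian(0.0, 1.0, 3.0, 1.0))   # mean shifts cost less at large sigma
```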
arXiv Detail & Related papers (2023-10-02T09:23:32Z)
- The Geometry of Neural Nets' Parameter Spaces Under Reparametrization [35.5848464226014]
We study the invariance of neural nets under reparametrization from the perspective of Riemannian geometry.
We discuss implications for measuring the flatness of minima, for optimization, and for probability-density maximization.
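The flatness issue reduces to a one-line observation: reparametrizing $w \mapsto \alpha w$ leaves loss values unchanged but scales the Hessian by $1/\alpha^2$. A toy sketch of that phenomenon (my illustration, not the paper's Riemannian treatment):

```python
import numpy as np

# Reparametrize w -> v = alpha * w. Loss values are unchanged, but the
# Hessian w.r.t. v is scaled by 1/alpha^2, so Hessian-based "flatness"
# is not invariant unless the metric is transformed along with the parameters.

alpha = 10.0

def loss_w(w):
    return 0.5 * w**2              # Hessian w.r.t. w equals 1

def loss_v(v):
    return loss_w(v / alpha)       # same function, new parametrization

def hess(f, x, eps=1e-4):
    return (f(x + eps) - 2 * f(x) + f(x - eps)) / eps**2

print(hess(loss_w, 0.0))           # ~1.0
print(hess(loss_v, 0.0))           # ~0.01 = 1/alpha^2: "flatter", same function
```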
arXiv Detail & Related papers (2023-02-14T22:48:24Z)
- Identifying latent distances with Finslerian geometry [6.0188611984807245]
The stochastic nature of generative models causes the data space and its geodesics to be at best impractical, and at worst impossible, to manipulate.
In this work, we propose another metric whose geodesics explicitly minimise the expected length of the pullback metric.
In high dimensions, we prove that both metrics converge to each other at a rate of $O\left(\frac{1}{D}\right)$.
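The pullback metric of a stochastic decoder is usually summarized by its expectation $G(z) = \mathbb{E}[J(z)^\top J(z)]$, which a Monte Carlo estimate makes concrete. A sketch with a hypothetical linear-plus-noise decoder (illustrative only; the paper's Finslerian metric is a different, related object):

```python
import numpy as np

# Monte Carlo estimate of the expected pullback metric G(z) = E[J(z)^T J(z)]
# for a stochastic decoder. (Hypothetical decoder with random Jacobian
# perturbations; the paper works with general stochastic generative models.)

rng = np.random.default_rng(0)
d, D = 2, 100                       # latent and ambient dimensions

def sample_jacobian(z):
    W = np.eye(D, d)                # mean decoder Jacobian (illustrative)
    return W + 0.1 * rng.standard_normal((D, d))

def expected_pullback_metric(z, n_samples=500):
    G = np.zeros((d, d))
    for _ in range(n_samples):
        J = sample_jacobian(z)
        G += J.T @ J
    return G / n_samples

G = expected_pullback_metric(np.zeros(d))
v = np.array([1.0, 0.0])
print(np.sqrt(v @ G @ v))           # Riemannian norm of v at z = 0 under G
```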
arXiv Detail & Related papers (2022-12-20T05:57:27Z)
- A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived, unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
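The Euclidean recipe referred to here expresses every such diffusion for a target $\propto e^{-H}$ as $dz = -(D+Q)\nabla H\,dt + \Gamma\,dt + \sqrt{2D}\,dW$ with $D \succeq 0$, $Q$ skew-symmetric, and $\Gamma_i = \sum_j \partial_j (D_{ij} + Q_{ij})$; with constant $D$ and $Q$ the correction $\Gamma$ vanishes. A minimal Euler-Maruyama sketch for a Gaussian target (my illustration of the Euclidean recipe, not the paper's manifold generalization):

```python
import numpy as np

# Euler-Maruyama discretization of the complete recipe
#   dz = -(D + Q) grad H(z) dt + sqrt(2 D) dW
# with constant positive-semidefinite D and skew-symmetric Q (so Gamma = 0).
# Target: standard Gaussian, H(z) = ||z||^2 / 2.

rng = np.random.default_rng(1)
D_mat = np.eye(2)                            # diffusion part (Langevin)
Q = np.array([[0.0, 1.0], [-1.0, 0.0]])      # skew-symmetric circulation part
sqrt2D = np.linalg.cholesky(2.0 * D_mat)

z = np.zeros(2)
dt, n_steps = 1e-2, 200_000
samples = np.empty((n_steps, 2))
for t in range(n_steps):
    noise = sqrt2D @ rng.standard_normal(2) * np.sqrt(dt)
    z = z - (D_mat + Q) @ z * dt + noise     # grad H(z) = z
    samples[t] = z

print(samples[1000:].var(axis=0))            # approx. 1.0 per coordinate (N(0, I))
```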
arXiv Detail & Related papers (2021-05-06T17:36:55Z)
- Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model the nonlinear geometric structure inherent in data, but the associated geometric operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
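In its simplest Euclidean form, BQ fits a Gaussian process to the integrand and integrates the posterior mean in closed form against the measure. A 1-D sketch with an RBF kernel and Gaussian measure (textbook construction; the paper's contribution is the extension to Riemannian manifolds):

```python
import numpy as np

# 1-D Bayesian quadrature: GP with RBF kernel k, Gaussian measure pi = N(0, sigma^2).
# The kernel mean z_i = E_pi[k(x_i, X)] is closed-form, and the BQ estimate of
# E_pi[f] is z^T K^{-1} f(X). (Euclidean textbook version, not the manifold one.)

ell, sigma = 0.5, 1.0

def k(a, b):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2.0 * ell**2))

def kernel_mean(x):
    c = ell / np.sqrt(ell**2 + sigma**2)
    return c * np.exp(-x**2 / (2.0 * (ell**2 + sigma**2)))

x = np.linspace(-3.0, 3.0, 15)               # evaluation nodes
K = k(x, x) + 1e-10 * np.eye(len(x))
estimate = kernel_mean(x) @ np.linalg.solve(K, np.cos(x))

print(estimate, np.exp(-sigma**2 / 2.0))     # BQ estimate vs. exact E[cos X]
```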
arXiv Detail & Related papers (2021-02-12T17:38:04Z) - Operator-valued formulas for Riemannian Gradient and Hessian and
families of tractable metrics [0.0]
We provide Riemannian gradient and Hessian formulas for a quotient of a manifold embedded in an inner product space with a non-constant metric function.
We extend the list of potential metrics that could be used in manifold optimization and machine learning.
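The object behind such formulas is the metric operator $G(x)$: the Riemannian gradient solves $G(x)\,\mathrm{grad} f = \nabla f$, so a metric is tractable when this solve is cheap. A sketch with a diagonal, non-constant metric (illustrative; not the paper's quotient-manifold formulas):

```python
import numpy as np

# Riemannian gradient under an operator-valued metric g_x(u, v) = u^T G(x) v:
# solve G(x) grad f = Euclidean gradient of f. With a diagonal G(x) the solve
# is elementwise, i.e. "tractable". (Not the paper's quotient construction.)

def G_diag(x):
    return 1.0 + x**2                  # positive diagonal of the metric operator

def egrad_f(x):
    return 2.0 * x                     # Euclidean gradient of f(x) = ||x||^2

def rgrad_f(x):
    return egrad_f(x) / G_diag(x)      # diagonal solve of G(x) grad = egrad

x = np.array([1.0, -2.0, 0.5])
g = rgrad_f(x)
v = np.array([0.3, 0.1, -0.7])
# Defining identity: g_x(grad f, v) = <Euclidean grad f, v> for every v
assert np.isclose(np.sum(G_diag(x) * g * v), np.dot(egrad_f(x), v))
print(g)
```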
arXiv Detail & Related papers (2020-09-21T20:15:57Z)
- Manifold Learning via Manifold Deflation [105.7418091051558]
Nonlinear dimensionality reduction methods provide a valuable means to visualize and interpret high-dimensional data. However, many popular methods can fail dramatically, even on simple two-dimensional manifolds. This paper presents an embedding method based on a novel, incremental tangent space estimator that incorporates global structure as coordinates.
Empirically, we show our algorithm recovers novel and interesting embeddings on real-world and synthetic datasets.
arXiv Detail & Related papers (2020-07-07T10:04:28Z)