Geometric Meta-Learning via Coupled Ricci Flow: Unifying Knowledge Representation and Quantum Entanglement
- URL: http://arxiv.org/abs/2503.19867v1
- Date: Tue, 25 Mar 2025 17:32:31 GMT
- Title: Geometric Meta-Learning via Coupled Ricci Flow: Unifying Knowledge Representation and Quantum Entanglement
- Authors: Ming Lei, Christophe Baehr
- Abstract summary: This paper establishes a unified framework integrating geometric flows with deep learning through three fundamental innovations. First, we propose a thermodynamically coupled Ricci flow that dynamically adapts parameter space geometry to loss landscape topology. Second, we derive explicit phase transition thresholds and critical learning rates through curvature blowup analysis. Third, we establish an AdS/CFT-type holographic duality (Theorem~\ref{thm:ads}) between neural networks and conformal field theories.
- Score: 7.410691988131121
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: This paper establishes a unified framework integrating geometric flows with deep learning through three fundamental innovations. First, we propose a thermodynamically coupled Ricci flow that dynamically adapts parameter space geometry to loss landscape topology, formally proved to preserve isometric knowledge embedding (Theorem~\ref{thm:isometric}). Second, we derive explicit phase transition thresholds and critical learning rates (Theorem~\ref{thm:critical}) through curvature blowup analysis, enabling automated singularity resolution via geometric surgery (Lemma~\ref{lem:surgery}). Third, we establish an AdS/CFT-type holographic duality (Theorem~\ref{thm:ads}) between neural networks and conformal field theories, providing entanglement entropy bounds for regularization design. Experiments demonstrate 2.1$\times$ convergence acceleration and 63\% topological simplification while maintaining $\mathcal{O}(N\log N)$ complexity, outperforming Riemannian baselines by 15.2\% in few-shot accuracy. Theoretically, we prove exponential stability (Theorem~\ref{thm:converge}) through a new Lyapunov function combining Perelman entropy with Wasserstein gradient flows, fundamentally advancing geometric deep learning.
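As a rough, self-contained illustration of the kind of geometry-adaptive optimization the abstract describes (not the paper's thermodynamically coupled Ricci flow or its surgery mechanism), the sketch below evolves a diagonal parameter-space metric toward a curvature estimate of a toy loss and uses it to precondition gradient steps; the toy loss and all names are assumptions for illustration only.

```python
# Loose caricature of geometry-adaptive optimization: a diagonal parameter-space
# metric relaxes toward a curvature estimate of the loss (a crude stand-in for a
# coupled geometric flow), and gradient steps are preconditioned by its inverse.
# Toy quadratic loss; not the paper's coupled Ricci flow.
import numpy as np

A = np.diag([100.0, 1.0])             # ill-conditioned toy loss: L(w) = 0.5 w^T A w

def grad(w):
    return A @ w

def curvature_diag(w):
    return np.diag(A)                 # diagonal Hessian of the toy loss

def metric_flow_descent(w0, lr=1e-2, tau=0.1, steps=300, eps=1e-8):
    w = np.array(w0, dtype=float)
    g_metric = np.ones_like(w)        # diagonal metric, initialised to Euclidean
    for _ in range(steps):
        # metric "flow": relax the metric toward the local curvature
        g_metric += tau * (curvature_diag(w) - g_metric)
        # geometry-aware step: precondition the gradient by the inverse metric
        w -= lr * grad(w) / (g_metric + eps)
    return w

print(metric_flow_descent([1.0, 1.0]))   # approaches the minimiser at the origin
```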
Related papers
- Geometric Neural Process Fields [58.77241763774756]
Geometric Neural Process Fields (G-NPF) is a probabilistic framework for neural radiance fields that explicitly captures uncertainty. Building on these bases, we design a hierarchical latent variable model, allowing G-NPF to integrate structural information across multiple spatial levels. Experiments on novel-view synthesis for 3D scenes, as well as 2D image and 1D signal regression, demonstrate the effectiveness of our method.
arXiv Detail & Related papers (2025-02-04T14:17:18Z) - STITCH: Surface reconstrucTion using Implicit neural representations with Topology Constraints and persistent Homology [23.70495314317551]
We present STITCH, a novel approach for neural implicit surface reconstruction of a sparse and irregularly spaced point cloud. We develop a new differentiable framework based on persistent homology to formulate topological loss terms that enforce the prior of a single 2-manifold object.
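STITCH's contribution is a differentiable persistent-homology loss, which is not reproduced here; the minimal sketch below only computes the underlying non-differentiable quantity, the number of connected components (Betti-0) of a point cloud's epsilon-neighborhood graph, which a single-object prior would drive toward one. The point cloud, threshold, and union-find implementation are illustrative assumptions.

```python
# Minimal sketch of the topological quantity behind a "single connected object"
# prior: Betti-0 (number of connected components) of the epsilon-neighborhood
# graph of a point cloud, via union-find. STITCH itself uses a *differentiable*
# persistent-homology formulation; this is only the non-differentiable count.
import numpy as np

def betti0(points, eps):
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj

    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    for i in range(n):
        for j in range(i + 1, n):
            if d[i, j] <= eps:
                union(i, j)
    return len({find(i) for i in range(n)})

rng = np.random.default_rng(0)
cloud = np.concatenate([rng.normal(0.0, 0.1, (50, 3)),   # two well-separated blobs
                        rng.normal(3.0, 0.1, (50, 3))])
print(betti0(cloud, eps=0.5))   # expect 2 components; a topological loss would penalise counts > 1
```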
arXiv Detail & Related papers (2024-12-24T22:55:35Z) - Relative Representations: Topological and Geometric Perspectives [53.88896255693922]
Relative representations are an established approach to zero-shot model stitching.
First, we introduce a normalization procedure in the relative transformation, resulting in invariance to non-isotropic rescalings and permutations.
Second, we propose to deploy topological densification when fine-tuning relative representations: a topological regularization loss encouraging clustering within classes.
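For context, a minimal sketch of the basic relative-representation transform the paper builds on (cosine similarities to a fixed set of anchors), together with a check of its invariance to rotations and isotropic rescaling; the paper's normalization for non-isotropic rescalings and its topological densification loss are not reproduced here.

```python
# Minimal sketch of the basic relative-representation transform: represent each
# embedding by its cosine similarities to a fixed set of anchor embeddings.
# This is invariant to rotations and isotropic rescaling of the latent space.
import numpy as np

def relative_representation(z, anchors):
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    return z @ a.T                      # (num_samples, num_anchors) cosine similarities

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))            # embeddings from some encoder
anchors = z[:4]                         # a few samples reused as anchors
rel = relative_representation(z, anchors)

# invariance check under a random rotation and isotropic rescaling of the space
q, _ = np.linalg.qr(rng.normal(size=(16, 16)))
rel_rot = relative_representation(3.0 * z @ q, 3.0 * anchors @ q)
print(np.allclose(rel, rel_rot))        # True
```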
arXiv Detail & Related papers (2024-09-17T08:09:22Z) - Sigma Flows for Image and Data Labeling and Learning Structured Prediction [2.4699742392289]
This paper introduces the sigma flow model for the prediction of structured labelings of data observed on a Riemannian manifold.
The approach combines the Laplace-Beltrami framework for image denoising and enhancement, introduced by Sochen, Kimmel and Malladi about 25 years ago, and the assignment flow approach introduced and studied by the authors.
arXiv Detail & Related papers (2024-08-28T17:04:56Z) - Gaussian Entanglement Measure: Applications to Multipartite Entanglement of Graph States and Bosonic Field Theory [50.24983453990065]
An entanglement measure based on the Fubini-Study metric has been recently introduced by Cocchiarella and co-workers.
We present the Gaussian Entanglement Measure (GEM), a generalization of the geometric entanglement measure for multimode Gaussian states.
By providing a computable multipartite entanglement measure for systems with a large number of degrees of freedom, we show that our definition can be used to obtain insights into a free bosonic field theory.
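GEM itself is built on the Fubini-Study metric and is not reproduced here; as a hedged illustration of the kind of covariance-matrix computation Gaussian-state entanglement quantifiers rely on, the sketch below extracts symplectic eigenvalues (the moduli of the eigenvalues of iΩσ) for a two-mode squeezed vacuum state, using the convention in which the vacuum covariance is the identity.

```python
# Sketch of a standard computable ingredient for Gaussian-state entanglement:
# the symplectic eigenvalues of a covariance matrix sigma, i.e. the moduli of
# the eigenvalues of i*Omega*sigma. GEM is a different (Fubini-Study based)
# measure; this only illustrates the covariance-matrix machinery such measures
# rely on. Mode ordering (x1, p1, x2, p2), hbar = 1, vacuum covariance = I.
import numpy as np

def symplectic_eigenvalues(sigma):
    n_modes = sigma.shape[0] // 2
    omega = np.kron(np.eye(n_modes), np.array([[0.0, 1.0], [-1.0, 0.0]]))
    ev = np.linalg.eigvals(1j * omega @ sigma)
    # eigenvalues come in +/- pairs; keep one representative per mode
    return np.sort(np.abs(ev))[::2]

# two-mode squeezed vacuum covariance matrix with squeezing parameter r
r = 1.0
c, s = np.cosh(2 * r), np.sinh(2 * r)
Z = np.diag([1.0, -1.0])
sigma = np.block([[c * np.eye(2), s * Z],
                  [s * Z, c * np.eye(2)]])
print(symplectic_eigenvalues(sigma))   # both near 1: the global state is pure
```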
arXiv Detail & Related papers (2024-01-31T15:50:50Z) - Physics-informed neural networks for transformed geometries and manifolds [0.0]
We propose a novel method for integrating geometric transformations within PINNs to robustly accommodate geometric variations.
We demonstrate the enhanced flexibility over traditional PINNs, especially under geometric variations.
The proposed framework presents an outlook for training deep neural operators over parametrized geometries.
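A minimal sketch (assuming an affine map and a toy Poisson residual, neither taken from the paper) of the core ingredient: composing the network with a parametrized transformation of a fixed reference domain and evaluating PDE derivatives on the transformed geometry via the chain rule.

```python
# Minimal sketch (not the paper's construction): evaluate a PDE residual on a
# *transformed* geometry by composing the network with a parametrised map of a
# fixed reference domain. Here the map is affine, y = A x + b (unit square ->
# parallelogram), so physical derivatives follow from the chain rule with the
# constant inverse Jacobian A^{-1}.
import torch

torch.manual_seed(0)
net = torch.nn.Sequential(torch.nn.Linear(2, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))

A = torch.tensor([[1.5, 0.3], [0.0, 0.8]])   # geometry parameters (assumed)
Ainv = torch.linalg.inv(A)
M = Ainv @ Ainv.T                            # maps reference Hessian -> physical Laplacian

def physical_laplacian(x_ref):
    x = x_ref.clone().requires_grad_(True)
    u = net(x).sum()
    g = torch.autograd.grad(u, x, create_graph=True)[0]          # (N, 2) reference gradient
    lap = torch.zeros(x.shape[0])
    for i in range(2):
        gi = torch.autograd.grad(g[:, i].sum(), x, create_graph=True)[0]  # second derivatives
        for j in range(2):
            lap = lap + M[i, j] * gi[:, j]   # trace(M @ Hessian) = physical Laplacian
    return lap

x_ref = torch.rand(128, 2)                   # collocation points in the reference square
residual = physical_laplacian(x_ref) + 1.0   # e.g. Poisson problem -Δu = 1 on the mapped domain
print(residual.shape)                        # training would minimise residual**2 plus boundary terms
```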
arXiv Detail & Related papers (2023-11-27T15:47:33Z) - Message-Passing Neural Quantum States for the Homogeneous Electron Gas [41.94295877935867]
We introduce a message-passing-neural-network-based wave function Ansatz to simulate extended, strongly interacting fermions in continuous space.
We demonstrate its accuracy by simulating the ground state of the homogeneous electron gas in three spatial dimensions.
arXiv Detail & Related papers (2023-05-12T04:12:04Z) - A Functional-Space Mean-Field Theory of Partially-Trained Three-Layer Neural Networks [49.870593940818715]
We study the infinite-width limit of a type of three-layer NN model whose first layer is random and fixed.
Our theory accommodates different scaling choices of the model, resulting in two regimes of the MF limit that demonstrate distinctive behaviors.
arXiv Detail & Related papers (2022-10-28T17:26:27Z) - A singular Riemannian geometry approach to Deep Neural Networks I.
Theoretical foundations [77.86290991564829]
Deep Neural Networks are widely used for solving complex problems in several scientific areas, such as speech recognition, machine translation, and image analysis.
We study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric.
We investigate the theoretical properties of the maps of such a sequence, eventually focusing on the case of maps implementing neural networks of practical interest.
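A minimal sketch of the central object in this construction: the pullback of the output-manifold metric through a network map via its Jacobian, G(x) = J(x)^T H J(x). The tiny tanh network, the finite-difference Jacobian, and the Euclidean choice of H are illustrative assumptions; rank deficiency of J makes G only positive semi-definite, which is the singular situation the paper studies.

```python
# Minimal sketch of the pullback (possibly degenerate) metric induced on the
# input space by a smooth map f implemented as a small network:
# G(x) = J_f(x)^T H J_f(x), with H the metric on the last manifold (Euclidean
# here). Where J_f drops rank, G is only positive *semi*-definite.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(5, 3)), rng.normal(size=5)
W2, b2 = rng.normal(size=(2, 5)), rng.normal(size=2)

def f(x):                        # tiny tanh network R^3 -> R^2
    return W2 @ np.tanh(W1 @ x + b1) + b2

def jacobian(x, eps=1e-6):       # forward-difference Jacobian of f at x
    fx = f(x)
    return np.stack([(f(x + eps * e) - fx) / eps for e in np.eye(len(x))], axis=1)

x = np.array([0.3, -0.7, 1.1])
J = jacobian(x)
G = J.T @ J                      # pullback of the Euclidean metric on R^2
print(np.round(np.linalg.eigvalsh(G), 6))   # at most 2 nonzero eigenvalues: G is singular on R^3
```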
arXiv Detail & Related papers (2021-12-17T11:43:30Z) - A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived, unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
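The Euclidean recipe being generalized writes every diffusion with stationary density proportional to exp(-H) as dz = [-(D+Q)∇H + Γ] dt + √(2D) dW, with D symmetric positive semi-definite, Q antisymmetric, and Γ a divergence correction that vanishes for constant D and Q. A minimal Euler-Maruyama sketch targeting a standard 2-D Gaussian follows; all parameter choices are illustrative.

```python
# Minimal Euler-Maruyama sketch of the Euclidean "complete recipe" that the
# paper generalises to manifolds:
#   dz = [-(D + Q) grad H(z) + Gamma(z)] dt + sqrt(2 D) dW,
# which leaves exp(-H) invariant. With constant symmetric PSD D and
# antisymmetric Q, the correction term Gamma vanishes.
import numpy as np

def sample(grad_H, D, Q, z0, step=1e-2, n_steps=100_000, seed=0):
    rng = np.random.default_rng(seed)
    sqrt2D = np.linalg.cholesky(2.0 * D)
    z = np.array(z0, dtype=float)
    out = np.empty((n_steps, len(z)))
    for t in range(n_steps):
        drift = -(D + Q) @ grad_H(z)
        z = z + step * drift + np.sqrt(step) * (sqrt2D @ rng.normal(size=len(z)))
        out[t] = z
    return out

# target: standard 2-D Gaussian, H(z) = 0.5 ||z||^2, so grad H(z) = z
D = np.array([[1.0, 0.0], [0.0, 0.5]])    # symmetric positive definite
Q = np.array([[0.0, 1.0], [-1.0, 0.0]])   # antisymmetric "circulation" term
samples = sample(lambda z: z, D, Q, z0=[2.0, -2.0])
print(np.round(samples[1000:].mean(axis=0), 2))
print(np.round(np.cov(samples[1000:].T), 2))
# mean near [0, 0], covariance near the identity, despite the non-reversible Q dynamics
```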
arXiv Detail & Related papers (2021-05-06T17:36:55Z) - Neural Mechanics: Symmetry and Broken Conservation Laws in Deep Learning Dynamics [26.485269202381932]
Understanding the dynamics of neural network parameters during training is one of the key challenges in building a theoretical foundation for deep learning.
We show that any symmetry embedded in the network architecture imposes stringent geometric constraints on gradients and Hessians, leading to an associated conservation law.
We apply tools from finite difference methods to derive modified gradient flow, a differential equation that better approximates the numerical trajectory taken by SGD at finite learning rates.
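A minimal numeric check of the symmetry-to-conservation-law statement for one common case: when a weight vector enters the loss only through its normalized direction, the gradient is orthogonal to the weights, so the weight norm is (to leading order in the step size) conserved along the optimization trajectory. The toy regression problem is an assumption; the paper's modified gradient flow for finite learning rates is not reproduced.

```python
# Minimal numeric check: if the loss is invariant under rescaling w -> a*w
# (here because w is normalised before use), then <w, grad L(w)> = 0, so
# ||w||^2 is conserved under gradient flow and drifts only at O(step^2)
# under discrete gradient descent.
import numpy as np

rng = np.random.default_rng(0)
X, y = rng.normal(size=(64, 10)), rng.normal(size=64)

def loss_and_grad(w, eps=1e-12):
    u = w / (np.linalg.norm(w) + eps)        # scale-invariant use of w
    r = X @ u - y
    loss = 0.5 * np.mean(r ** 2)
    grad_u = X.T @ r / len(y)
    # chain rule through the normalisation: dL/dw = (I - u u^T) grad_u / ||w||
    grad_w = (grad_u - u * (u @ grad_u)) / (np.linalg.norm(w) + eps)
    return loss, grad_w

w = rng.normal(size=10)
print("<w, grad> =", np.dot(w, loss_and_grad(w)[1]))   # ~ 0: the conservation constraint

step = 1e-3
norms = []
for _ in range(2000):
    w = w - step * loss_and_grad(w)[1]
    norms.append(np.linalg.norm(w))
print("||w|| range:", round(min(norms), 4), round(max(norms), 4))   # essentially constant
```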
arXiv Detail & Related papers (2020-12-08T20:33:30Z) - The Heavy-Tail Phenomenon in SGD [7.366405857677226]
We show that depending on the structure of the Hessian of the loss at the minimum, the SGD iterates will converge to a heavy-tailed stationary distribution.
We translate our results into insights about the behavior of SGD in deep learning.
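A rough empirical sketch of the phenomenon (not the paper's analysis): single-sample SGD on one-dimensional linear regression is a Kesten-type stochastic recursion whose stationary distribution can be heavy tailed, and a Hill estimator on the iterates gives a crude tail-index estimate. The step size, sample count, and estimator choice are illustrative assumptions, and correlations between iterates are ignored.

```python
# Crude empirical sketch: single-sample SGD on 1-D linear regression is the
# recursion w_{t+1} = (1 - eta*x^2) w_t + eta*x*y, which can have a heavy-tailed
# stationary distribution. A Hill estimator gives a rough tail-index estimate;
# the paper makes the dependence on step size, batch size and curvature precise.
import numpy as np

rng = np.random.default_rng(0)

def sgd_iterates(eta=0.45, n_steps=200_000):
    w = 0.0
    ws = np.empty(n_steps)
    for t in range(n_steps):
        x = rng.normal()                 # single-sample "batch"
        y = 1.0 * x + rng.normal()       # true slope 1 plus noise
        grad = (w * x - y) * x           # gradient of 0.5*(w*x - y)^2
        w -= eta * grad
        ws[t] = w
    return ws

def hill_estimator(samples, k=2000):
    a = np.sort(np.abs(samples))[::-1]   # upper order statistics
    return 1.0 / np.mean(np.log(a[:k] / a[k]))

ws = sgd_iterates()
print("Hill tail-index estimate:", round(hill_estimator(ws - ws.mean()), 2))
# larger step sizes (relative to batch size) typically give smaller, heavier-tail estimates
```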
arXiv Detail & Related papers (2020-06-08T16:43:56Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.