Local-Curvature-Aware Knowledge Graph Embedding: An Extended Ricci Flow Approach
- URL: http://arxiv.org/abs/2512.07332v2
- Date: Wed, 10 Dec 2025 12:07:34 GMT
- Title: Local-Curvature-Aware Knowledge Graph Embedding: An Extended Ricci Flow Approach
- Authors: Zhengquan Luo, Guy Tadmor, Or Amar, David Zeevi, Zhiqiang Xu,
- Abstract summary: Knowledge graph embedding relies on the geometry of the embedding space to encode semantic and structural relations. We propose RicciKGE, which couples the KGE loss gradient with local curvatures in an extended Ricci flow. Experimental improvements on link prediction and node classification benchmarks demonstrate RicciKGE's effectiveness in adapting to heterogeneous knowledge graph structures.
- Score: 4.686364613477057
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge graph embedding (KGE) relies on the geometry of the embedding space to encode semantic and structural relations. Existing methods place all entities on one homogeneous manifold (Euclidean, spherical, hyperbolic, or their product/multi-curvature variants) to model linear, symmetric, or hierarchical patterns. Yet a predefined, homogeneous manifold cannot accommodate the sharply varying curvature that real-world graphs exhibit across local regions. Since this geometry is imposed a priori, any mismatch with the knowledge graph's local curvatures distorts distances between entities and hurts the expressiveness of the resulting KGE. To rectify this, we propose RicciKGE, which couples the KGE loss gradient with local curvatures in an extended Ricci flow so that entity embeddings co-evolve dynamically with the underlying manifold geometry toward mutual adaptation. Theoretically, when the coupling coefficient is bounded and properly selected, we rigorously prove that (i) all edge-wise curvatures decay exponentially, meaning that the manifold is driven toward Euclidean flatness; and (ii) the KGE distances strictly converge to a global optimum, indicating that geometric flattening and embedding optimization promote each other. Experimental improvements on link prediction and node classification benchmarks demonstrate RicciKGE's effectiveness in adapting to heterogeneous knowledge graph structures.
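The abstract describes the coupling only at a high level, so the following is a minimal toy sketch of the general idea in NumPy: a TransE-style embedding update whose per-edge metric scale depends on an edge-wise curvature, while that curvature is relaxed toward zero at each step with a weak coupling to the edge's current residual. The Forman-curvature initialisation, the exponential-decay surrogate for the extended Ricci flow, the coupling form, and all constants are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

# Tiny toy knowledge graph: (head, relation, tail) triples over 5 entities.
triples = [(0, 0, 1), (0, 0, 2), (0, 0, 3), (3, 1, 4)]
num_entities, num_relations, dim = 5, 2, 8

rng = np.random.default_rng(0)
E = rng.normal(scale=0.1, size=(num_entities, dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(num_relations, dim))  # relation embeddings

# Edge-wise curvature initialised with the combinatorial Forman-Ricci value
# F(u, v) = 4 - deg(u) - deg(v)  (an assumption; the paper does not fix this).
deg = np.zeros(num_entities)
for h, _, t in triples:
    deg[h] += 1
    deg[t] += 1
curv = {(h, t): 4.0 - deg[h] - deg[t] for h, _, t in triples}

lr, alpha, beta = 0.05, 0.5, 0.1   # step size, curvature decay, coupling strength

for step in range(200):
    grad_E = np.zeros_like(E)
    grad_R = np.zeros_like(R)
    res = {}
    for h, r, t in triples:
        # Curvature-dependent metric scale: flat edges (curv ~ 0) use scale ~ 1.
        s = 1.0 + 0.1 * abs(curv[(h, t)])
        diff = E[h] + R[r] - E[t]            # TransE-style residual
        res[(h, t)] = np.linalg.norm(diff)
        g = 2.0 * s * diff                   # gradient of s * ||h + r - t||^2
        grad_E[h] += g
        grad_E[t] -= g
        grad_R[r] += g
    E -= lr * grad_E
    R -= lr * grad_R
    # Surrogate "extended Ricci flow" step: curvature decays exponentially,
    # weakly coupled to the remaining embedding residual on that edge.
    for (h, t), c in curv.items():
        curv[(h, t)] = (1.0 - alpha * lr) * c - beta * lr * res[(h, t)]

print({e: round(c, 4) for e, c in curv.items()})
```

In this toy run both the embedding residuals and the edge curvatures shrink toward zero together; this is only meant to mirror, in spirit, the mutual flattening-and-convergence behaviour the abstract claims.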
Related papers
- Riemannian Flow Matching for Disentangled Graph Domain Adaptation [51.98961391065951]
Graph Domain Adaptation (GDA) typically uses adversarial learning to align graph embeddings in Euclidean space. DisRFM is a geometry-aware GDA framework that unifies embedding and flow-based transport.
arXiv Detail & Related papers (2026-01-31T11:05:35Z) - Gauge-invariant representation holonomy [1.078600700827543]
Deep networks learn internal representations whose geometry (how features bend, rotate, and evolve) affects both generalization and robustness. Existing similarity measures such as CKA or SVCCA capture pointwise overlap between activation sets, but miss how representations change along input paths. We introduce representation holonomy, a gauge-invariant statistic that measures this path dependence.
arXiv Detail & Related papers (2026-01-29T12:51:17Z) - Adaptive Riemannian Graph Neural Networks [29.859977834688625]
We introduce a novel framework that learns a continuous and anisotropic metric tensor field over the graph. It allows each node to determine its optimal local geometry, enabling the model to fluidly adapt to the graph's structural landscape. Our method demonstrates superior performance on both homophilic and heterophilic benchmarks.
arXiv Detail & Related papers (2025-08-04T16:55:02Z) - Learning Latent Graph Geometry via Fixed-Point Schrödinger-Type Activation: A Theoretical Study [1.1745324895296467]
We develop a unified theoretical framework for neural architectures whose internal representations evolve as stationary states of dissipative Schrödinger-type dynamics on learned latent graphs. We prove existence, uniqueness, and smooth dependence of equilibria, and show that the dynamics are equivalent under the Bloch map to norm-preserving Landau-Lifshitz flows. The resulting model class provides a compact, geometrically interpretable, and analytically tractable foundation for learning latent graph geometry via fixed-point Schrödinger-type activations.
arXiv Detail & Related papers (2025-07-27T00:35:15Z) - CurvGAD: Leveraging Curvature for Enhanced Graph Anomaly Detection [23.643189106137008]
We propose CurvGAD, a mixed-curvature graph autoencoder that introduces the notion of curvature-based geometric anomalies. It introduces two parallel pipelines for enhanced anomaly interpretability. Experiments over 10 real-world datasets demonstrate an improvement of up to 6.5% over state-of-the-art GAD methods.
arXiv Detail & Related papers (2025-02-12T17:49:46Z) - Bridging Geometric States via Geometric Diffusion Bridge [79.60212414973002]
We introduce the Geometric Diffusion Bridge (GDB), a novel generative modeling framework that accurately bridges initial and target geometric states.
GDB employs an equivariant diffusion bridge derived from a modified version of Doob's $h$-transform for connecting geometric states.
We show that GDB surpasses existing state-of-the-art approaches, opening up a new pathway for accurately bridging geometric states.
arXiv Detail & Related papers (2024-10-31T17:59:53Z) - Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z) - DeepRicci: Self-supervised Graph Structure-Feature Co-Refinement for Alleviating Over-squashing [72.70197960100677]
Graph Structure Learning (GSL) plays an important role in boosting Graph Neural Networks (GNNs) with a refined graph.
GSL solutions usually focus on structure refinement with task-specific supervision (i.e., node classification) or overlook the inherent weakness of GNNs themselves.
We propose to study self-supervised graph structure-feature co-refinement for effectively alleviating the issue of over-squashing in typical GNNs.
arXiv Detail & Related papers (2024-01-23T14:06:08Z) - Curve Your Attention: Mixed-Curvature Transformers for Graph Representation Learning [77.1421343649344]
We propose a generalization of Transformers towards operating entirely on the product of constant curvature spaces.
We also provide a kernelized approach to non-Euclidean attention, which enables our model to run with time and memory cost linear in the number of nodes and edges.
arXiv Detail & Related papers (2023-09-08T02:44:37Z) - kHGCN: Tree-likeness Modeling via Continuous and Discrete Curvature Learning [39.25873010585029]
This study explores the curvature connecting the discrete graph structure and the continuous learning space, aiming to encode the message conveyed by the network topology in the learning process.
A curvature-aware hyperbolic graph convolutional neural network, κHGCN, is proposed; it uses curvature to guide message passing and improve long-range propagation (a generic curvature-weighted message-passing sketch appears after this list).
arXiv Detail & Related papers (2022-12-04T10:45:42Z) - Shape And Structure Preserving Differential Privacy [70.08490462870144]
We show how the gradient of the squared distance function offers better control over sensitivity than the Laplace mechanism.
arXiv Detail & Related papers (2022-09-21T18:14:38Z) - Heterogeneous manifolds for curvature-aware graph embedding [6.3351090376024155]
Graph embeddings are used in a broad range of Graph ML applications.
The quality of such embeddings crucially depends on whether the geometry of the space matches that of the graph.
arXiv Detail & Related papers (2022-02-02T18:18:35Z) - Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs [81.12344211998635]
A common approach to defining convolutions on meshes is to interpret them as graphs and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)
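The κHGCN entry above mentions curvature-guided message passing without spelling out a mechanism. Below is a minimal, generic sketch (an illustrative assumption, not κHGCN's actual hyperbolic operator): each neighbor message is weighted by a sigmoid of the edge's combinatorial Forman-Ricci curvature, so tree-like, negatively curved edges are damped relative to flat or positively curved ones.

```python
import numpy as np

# Toy undirected graph and random node features.
edges = [(0, 1), (1, 2), (1, 3), (3, 4)]
num_nodes, dim = 5, 4
rng = np.random.default_rng(1)
X = rng.normal(size=(num_nodes, dim))

# Node degrees, used by the combinatorial Forman-Ricci curvature.
deg = np.zeros(num_nodes)
for u, v in edges:
    deg[u] += 1
    deg[v] += 1

def forman(u, v):
    # Forman-Ricci curvature of an unweighted edge (triangle terms ignored).
    return 4.0 - deg[u] - deg[v]

def curvature_weighted_propagate(X, edges):
    # One aggregation step: each neighbor message is scaled by a sigmoid of
    # the edge curvature, mapping curvature into a weight in (0, 1).
    out = X.copy()
    for u, v in edges:
        w = 1.0 / (1.0 + np.exp(-forman(u, v)))
        out[u] += w * X[v]
        out[v] += w * X[u]
    # L2-normalize rows to keep feature scales comparable across nodes.
    return out / np.linalg.norm(out, axis=1, keepdims=True)

print(curvature_weighted_propagate(X, edges))
```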